When I try to extract the Y plane from the YCbCr image, I see nothing but corrupted output. The buffer mapped to the CPU with iimage->mapBuffer(0) is larger than my image's stream resolution. Is the pixel layout different from what is described here: https://en.wikipedia.org/wiki/YUV#Y%E2%80%B2UV420sp_(NV21)to_RGB_conversion(Android)?
Which platform are you using? Please provide more details: BSP version and sensor information.
The wiki page you referred to describes the standard picture format of the Android camera preview.
Please note that JetPack releases are based on L4T, where the default color format is NvBufferColorFormat_YUV420.
Please refer to the Multimedia API Sample Applications for examples of accessing your camera sensor; you could also modify the sample code for your own implementation.
Besides your platform/BSP/sensor information, may I ask what the purpose of extracting the Y plane is?
I am using a Jetson AGX Xavier (L4T 31.1, JetPack 4.1) with a Leopard Imaging IMX477 MIPI module and carrier board. I will also be trying a TX2 with L4T 28.2 and the same camera. I am using libargus to capture frames from the module.
At the moment the use case is viewing the greyscale image, but the longer-term goal is to convert from NvBufferColorFormat_YUV420 to RGB32. I am currently using 07_video_convert as a basis, per your recommendation.