Multimedia API pixel format conversion issues

I am using V4L2 to capture frames into DMA buffers created with NvBufferCreateEx() and mapped with NvBufferMemMap(), then rendering them by creating an EGLImageKHR through NvEGLImageFromFd() and drawing with OpenGL ES. I have gotten this working in some cases, though not in an ideal way. The main reason I am using the Multimedia API here is the low latency the DMA buffers provide.
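
For reference, a minimal sketch of that path (error handling omitted; the resolution, colour format, and buffer tag below are placeholders rather than my exact settings):

```c
#include <string.h>
#include "nvbuf_utils.h"
#include <EGL/egl.h>
#include <EGL/eglext.h>

int create_capture_buffer(EGLDisplay egl_display)
{
    int dmabuf_fd = -1;
    void *mapped = NULL;

    NvBufferCreateParams params;
    memset(&params, 0, sizeof(params));
    params.width       = 1280;                        /* placeholder resolution */
    params.height      = 1024;
    params.payloadType = NvBufferPayload_SurfArray;
    params.layout      = NvBufferLayout_Pitch;
    params.colorFormat = NvBufferColorFormat_ARGB32;
    params.nvbuf_tag   = NvBufferTag_CAMERA;

    NvBufferCreateEx(&dmabuf_fd, &params);            /* allocate the DMA buffer */
    NvBufferMemMap(dmabuf_fd, 0, NvBufferMem_Read_Write, &mapped);  /* CPU mapping of plane 0 */

    /* ... V4L2 fills the buffer (VIDIOC_QBUF/VIDIOC_DQBUF with V4L2_MEMORY_DMABUF) ... */

    /* Wrap the same fd as an EGLImage so the GL ES renderer can sample it. */
    EGLImageKHR egl_image = NvEGLImageFromFd(egl_display, dmabuf_fd);
    (void)egl_image;

    return dmabuf_fd;
}
```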

My issues all seem to come down to pixel format. The Multimedia API does not support my camera's pixel format (greyscale), but I was able to work around that by modifying the kernel to pack the incoming MIPI sensor data into a 32-bit buffer. I then created the DMA buffer as ARGB32 so that NvEGLImageFromFd() would not modify the sensor data, allowing me to extract the real data in the GLSL fragment shader.
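
The fragment shader side looks roughly like this (a sketch only; it assumes the texture is bound to the EGLImage as GL_TEXTURE_EXTERNAL_OES, and the channel packing shown is an example rather than the exact layout my kernel patch produces):

```c
/* Fragment shader source used by the GL ES renderer (ESSL 1.00). */
static const char *frag_src =
    "#extension GL_OES_EGL_image_external : require\n"
    "precision mediump float;\n"
    "uniform samplerExternalOES uFrame;\n"  /* texture backed by the ARGB32 EGLImage */
    "varying vec2 vTexCoord;\n"
    "void main() {\n"
    /* Reassemble a 16-bit grey sample from two 8-bit channels (assumed packing). */
    "    vec4 p = texture2D(uFrame, vTexCoord);\n"
    "    float grey = (p.r * 255.0 * 256.0 + p.g * 255.0) / 65535.0;\n"
    "    gl_FragColor = vec4(vec3(grey), 1.0);\n"
    "}\n";
```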

Now I am using a USB camera. This camera outputs in a few different YUV formats and resolutions. I have tested the camera in GStreamer and OpenCV and it renders correctly, but when I go through the Multimedia API the conversion from YUV to RGB is not done correctly. It does work if I set the driver resolution to 1280x1024. I am a bit baffled here and have more or less ruled out my own code as the cause.
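
For what it's worth, the kind of explicit conversion I could use to isolate the problem looks roughly like this (a sketch only; treating the capture buffer as NV12 and converting to ABGR32 are assumptions about my particular setup, not tested code):

```c
#include <string.h>
#include "nvbuf_utils.h"

/* Blit the captured YUV buffer into an RGBA buffer with the hardware
 * converter, then hand the RGBA fd to NvEGLImageFromFd() instead. */
int convert_to_rgba(int yuv_dmabuf_fd, int width, int height)
{
    int rgba_fd = -1;

    NvBufferCreateParams params;
    memset(&params, 0, sizeof(params));
    params.width       = width;
    params.height      = height;
    params.payloadType = NvBufferPayload_SurfArray;
    params.layout      = NvBufferLayout_Pitch;
    params.colorFormat = NvBufferColorFormat_ABGR32;
    params.nvbuf_tag   = NvBufferTag_NONE;
    NvBufferCreateEx(&rgba_fd, &params);

    NvBufferTransformParams tp;
    memset(&tp, 0, sizeof(tp));
    tp.transform_flag   = NVBUFFER_TRANSFORM_FILTER;
    tp.transform_filter = NvBufferTransform_Filter_Smart;
    NvBufferTransform(yuv_dmabuf_fd, rgba_fd, &tp);   /* YUV -> RGBA conversion */

    return rgba_fd;
}
```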

  1. Is there a way to go from DMA buffer to GLSL shader with the raw data, skipping the automatic conversion?
  2. Are there any known bugs with the conversion from NV12/YV12/YU12 to RGBA?

Hi,
We have a sample for launching USB cameras, 12_camera_v4l2_cuda. The GREY format is supported. Please take a look.
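
For example, the sample can be launched with something like the following (please check the sample README for the exact options on your release):

```
$ ./camera_v4l2_cuda -d /dev/video0 -s 640x480 -f YUYV -n 30 -c
```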

I’m looking at the code for that sample application, and it does not appear to support greyscale. I am using JetPack 4.2. Which version supports greyscale? Regardless, the USB camera outputs a YUV format.

My other (MIPI) sensor is greyscale, but its bit depth is probably higher than NvBuffers support. For example, I noticed that the documentation for the L4T Multimedia API 32.2 has NvBufferColorFormat_GRAY8, which would not work. That enum is also missing from my nvbuf_utils.h header, so it is a moot point.

My problem seems to lie with the conversion from NvBuffer to EGLImageKHR using NvEGLImageFromFd().

Hi,
GREY support is available from JetPack 4.2.1 (r32.2). You may use SDK Manager to install the tegra_multimedia_api samples matching your system image.
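
You can check which L4T release is installed on the board with:

```
$ head -n 1 /etc/nv_tegra_release
```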