NVMM allocation and non-exhaustive pixel format list in the NvBuffer API

Hello nvidia forum !

I’m currently working on a project where we’re developing custom cameras and accessing them through the Xavier AGX’s PCIe interface. To that end, we are building a V4L2 driver and a GStreamer plugin inspired by the gstnvv4l2camerasrc sample.

The idea behind those choices is to use NvBuffers in our DMABUF transfers, so we can get our frames directly into GPU memory (NVMM) and leverage the hardware encoders and CUDA features without any copies. This is a strong requirement, since our projected use cases will almost certainly need all the bandwidth the Xavier has to offer.

We’re concerned about the limited set of pixel formats offered by the NvBufferCreate API (nvbuf_utils.h). The list is far from exhaustive, and we’d like to cover some formats that are not listed in the NvBufferColorFormat enum. Some cameras may only support a particular format, in which case we won’t be able to get optimal performance on video transfers and processing.
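For context, here is a sketch of the allocation path we are using today, via nvbuf_utils.h from the Jetson Multimedia API. The function name `allocate_nvmm_frame` is ours, and the NV12 choice is just an example; the point is that the color format argument is confined to the NvBufferColorFormat enum:

```c
/* Illustrative sketch only; error handling trimmed.
 * Requires the Jetson Multimedia API (nvbuf_utils.h). */
#include <stdio.h>
#include "nvbuf_utils.h"

int allocate_nvmm_frame(int width, int height)
{
    int dmabuf_fd = -1;

    /* The color format is restricted to the NvBufferColorFormat enum --
     * this is exactly the limitation we are asking about. */
    if (NvBufferCreate(&dmabuf_fd, width, height,
                       NvBufferLayout_Pitch,
                       NvBufferColorFormat_NV12) != 0) {
        fprintf(stderr, "NvBufferCreate failed\n");
        return -1;
    }

    NvBufferParams params;
    NvBufferGetParams(dmabuf_fd, &params);
    /* params.pixel_format carries the NvBufferColorFormat the buffer was
     * created with, which downstream components seem to rely on. */

    return dmabuf_fd;  /* fd can then be queued to V4L2 as a DMABUF */
}
```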

Do these limitations come from hardware constraints? The Technical Reference Manual lists a whole range of pixel formats in “Table 7.39 Pixel Format Support”. How can we work with those? What is the purpose of NvBufferColorFormat in NvBufferParams? Is it meant to carry a data description that the libraries / GStreamer plugins rely on for their image processing?

Is there a way to allocate GPU memory without being constrained by the NvBufferColorFormat enumeration, i.e. to specify our own pixel format during allocation and then do what we need with CUDA or other transformation / encoding tools and libraries?

Best regards,

Jean-Christophe BEGUE

Could you share which formats you need? We would need to check whether they are supported on Jetson platforms. Certain formats, such as BGR, are not supported; for those you would need to capture the frames into a CUDA buffer first and then convert them to a defined NvBuffer format.
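The "capture into CUDA, then convert" path can be sketched as follows: map the NvBuffer's dmabuf fd into CUDA through EGL so that a kernel can write converted pixels straight into NVMM. This follows the pattern used in the Multimedia API samples; treat it as an outline under those assumptions, not a drop-in implementation (error handling and the conversion kernel itself are omitted):

```c
/* Sketch: make an NvBuffer's planes visible to CUDA via EGL interop.
 * Requires nvbuf_utils.h, cudaEGL.h and an initialized EGL display. */
#include <cuda.h>
#include <cudaEGL.h>
#include "nvbuf_utils.h"

int write_into_nvbuffer_with_cuda(EGLDisplay display, int dmabuf_fd)
{
    /* Wrap the dmabuf fd in an EGLImage. */
    EGLImageKHR egl_image = NvEGLImageFromFd(display, dmabuf_fd);
    if (egl_image == NULL)
        return -1;

    /* Register the EGLImage with CUDA and map it as an EGL frame. */
    CUgraphicsResource resource;
    if (cuGraphicsEGLRegisterImage(&resource, egl_image,
            CU_GRAPHICS_MAP_RESOURCE_FLAGS_NONE) != CUDA_SUCCESS)
        return -1;

    CUeglFrame frame;
    cuGraphicsResourceGetMappedEglFrame(&frame, resource, 0, 0);

    /* frame now exposes the NvBuffer planes in device memory; a custom
     * CUDA kernel could convert from the camera's native format into
     * the NvBuffer's layout here. */

    cuGraphicsUnregisterResource(resource);
    NvDestroyEGLImage(display, egl_image);
    return 0;
}
```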


Thanks for the response.

Here are the formats we’d need (excluding those that are already available):

  • I420
  • RGBA
  • ARGB
  • YUY2

Also, my question was about how NvBufferColorFormat is used throughout the NVIDIA GStreamer ecosystem: does the nvvidconv plugin rely on it, or does it use the GStreamer caps?

Otherwise, where can I find documentation about the relationship between NvBuffers and the provided plugins? We plan to use DeepStream and are looking for resources to understand how to build around it.

Using CUDA buffers and then converting them looks promising, but will we be able to use the provided GStreamer plugins with them? Is there documentation or an example for that?

For the DeepStream SDK, please take a look at the
DeepStream Development Guide — DeepStream Version: 5.0 documentation (nvidia.com)

After the installation, the samples are in


The DeepStream SDK uses the NvBufSurface APIs, which are unified across Jetson platforms and desktop GPUs. Please check the source code and README in