How to convert an OpenCV Mat to NvBuffer (dmabuf_fd)

Hi,

I want to overlay an image (from a .png or .jpeg file) on a video frame with GStreamer. My idea is to write a GStreamer plugin that loads the image file using OpenCV and converts it to an NvBuffer (dmabuf_fd). Then I could call NvBufferComposite() to composite the video stream from the GStreamer pipeline (e.g. nvv4l2camerasrc) and the image together.
However, I don't know how to convert a cv::Mat to an NvBuffer. Or could I load the image and decode it directly to an RGBA NvBuffer? Any help would be appreciated.

Thanks.

Hi,
Decoding a JPEG directly into an NvBuffer is supported. Please refer to

/usr/src/jetson_multimedia_api/samples/06_jpeg_decode
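
As a rough, untested sketch of that path (assuming the NvJPEGDecoder class from NvJpegDecoder.h that the sample uses; the helper name and file handling here are only illustrative):

```cpp
// Minimal sketch: decode a JPEG file into an NvBuffer (dmabuf_fd) with the
// hardware decoder, following the approach of samples/06_jpeg_decode.
// Error handling is trimmed and the helper name is illustrative.
#include <cstdint>
#include <fstream>
#include <vector>
#include "NvJpegDecoder.h"

int decode_jpeg_to_fd(const char *path, uint32_t &width, uint32_t &height)
{
    std::ifstream file(path, std::ios::binary);
    std::vector<unsigned char> data((std::istreambuf_iterator<char>(file)),
                                    std::istreambuf_iterator<char>());

    NvJPEGDecoder *dec = NvJPEGDecoder::createJPEGDecoder("jpegdec");
    int fd = -1;
    uint32_t pixfmt = 0;
    // decodeToFd() returns the decoded surface as a dmabuf_fd (typically YUV420).
    int ret = dec->decodeToFd(fd, data.data(), data.size(), pixfmt, width, height);
    delete dec;
    return (ret == 0) ? fd : -1;
}
```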

There is no hardware decoder for PNG files, so you would need to use a software decoder and then call Raw2NvBuffer().

All NvBuffer APIs are in

/usr/src/jetson_multimedia_api/include/nvbuf_utils.h
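
For the PNG path, a rough sketch assuming OpenCV for the software decode and the nvbuf_utils.h APIs (the RGBA color-format constant below is an assumption; check which constant matches your byte order):

```cpp
// Rough sketch (JetPack 4.x, nvbuf_utils.h): load a PNG with OpenCV, convert
// it to RGBA, allocate a pitch-linear NvBuffer, and copy the pixels into it
// with Raw2NvBuffer(). Verify NvBufferColorFormat_ABGR32 vs _ARGB32 for your
// desired byte order.
#include <opencv2/opencv.hpp>
#include "nvbuf_utils.h"

int png_to_nvbuffer(const char *path)
{
    cv::Mat img = cv::imread(path, cv::IMREAD_UNCHANGED);   // software PNG decode
    cv::Mat rgba;
    cv::cvtColor(img, rgba, img.channels() == 4 ? cv::COLOR_BGRA2RGBA
                                                : cv::COLOR_BGR2RGBA);

    NvBufferCreateParams params = {0};
    params.width = rgba.cols;
    params.height = rgba.rows;
    params.payloadType = NvBufferPayload_SurfArray;
    params.layout = NvBufferLayout_Pitch;
    params.colorFormat = NvBufferColorFormat_ABGR32;         // assumed RGBA layout
    params.nvbuf_tag = NvBufferTag_NONE;

    int dmabuf_fd = -1;
    if (NvBufferCreateEx(&dmabuf_fd, &params) != 0)
        return -1;

    // Copy the packed CPU image into plane 0 of the NvBuffer.
    if (Raw2NvBuffer(rgba.data, 0, rgba.cols, rgba.rows, dmabuf_fd) != 0) {
        NvBufferDestroy(dmabuf_fd);
        return -1;
    }
    return dmabuf_fd;   // ready to be passed to NvBufferComposite()
}
```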

If you use JetPack 5, please use the NvBufSurface APIs instead of the NvBuffer APIs.
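
On JetPack 5, an untested equivalent sketch with nvbufsurface.h would look roughly like this (function and field names should be checked against the header in your release):

```cpp
// JetPack 5 sketch (nvbufsurface.h): copy an RGBA cv::Mat into an NvBufSurface.
// Constant and field names are assumptions to verify against nvbufsurface.h.
#include <cstdint>
#include <cstring>
#include <opencv2/opencv.hpp>
#include "nvbufsurface.h"

NvBufSurface *mat_to_nvbufsurface(const cv::Mat &rgba)   // expects a CV_8UC4 RGBA Mat
{
    NvBufSurfaceCreateParams params = {};
    params.width = rgba.cols;
    params.height = rgba.rows;
    params.colorFormat = NVBUF_COLOR_FORMAT_RGBA;
    params.layout = NVBUF_LAYOUT_PITCH;
    params.memType = NVBUF_MEM_SURFACE_ARRAY;

    NvBufSurface *surf = nullptr;
    if (NvBufSurfaceCreate(&surf, 1, &params) != 0)
        return nullptr;

    // Map plane 0 for CPU writes and copy row by row (destination is pitch-linear).
    NvBufSurfaceMap(surf, 0, 0, NVBUF_MAP_READ_WRITE);
    NvBufSurfaceSyncForCpu(surf, 0, 0);
    auto &plane = surf->surfaceList[0];
    auto *dst = static_cast<uint8_t *>(plane.mappedAddr.addr[0]);
    for (int y = 0; y < rgba.rows; ++y)
        std::memcpy(dst + y * plane.planeParams.pitch[0], rgba.ptr(y), rgba.cols * 4);
    NvBufSurfaceSyncForDevice(surf, 0, 0);
    NvBufSurfaceUnMap(surf, 0, 0);
    return surf;
}
```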

OK, I will try this.

Thanks

Sorry, I have another question: can I composite two dmabuf_fds with different pixel formats (e.g. one is YUV420M and the other is I420) using NvBufferComposite()?

Thanks.

Hi,
The format of each NvBuffer should be identical. You can convert them to the same format by calling NvBufferTransform().

Per our understanding, both YUV420M and I420 map to NvBufferColorFormat_YUV420, so they should already be identical.
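
For illustration, a minimal sketch of converting one dmabuf_fd into the other buffer's format with NvBufferTransform() before calling NvBufferComposite() (dst_fd is assumed to be already allocated in the target format, e.g. via NvBufferCreateEx()):

```cpp
// Sketch: make src_fd's contents available in dst_fd's pixel format with
// NvBufferTransform(); dst_fd must already be allocated with the same color
// format as the other composite input. Only the filter flag is set, so the
// whole surface is converted without cropping.
#include "nvbuf_utils.h"

int convert_for_composite(int src_fd, int dst_fd)
{
    NvBufferTransformParams tp = {0};
    tp.transform_flag = NVBUFFER_TRANSFORM_FILTER;
    tp.transform_filter = NvBufferTransform_Filter_Smart;
    return NvBufferTransform(src_fd, dst_fd, &tp);   // returns 0 on success
}
```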
