I want to overlay an image (from a .png or .jpeg file) on a video frame with GStreamer. My idea is to write a GStreamer plugin that loads the image file using OpenCV and converts it to an NvBuffer (dmabuf_fd). Then I could call NvBufferComposite() to composite the image with the video stream from the GStreamer pipeline (e.g. nvv4l2camerasrc).
However, I don't know how to convert a cv::Mat to an NvBuffer. Or could I load the image and decode it directly to an RGBA NvBuffer? Any help would be appreciated.
Decoding JPEG into an NvBuffer is supported by the hardware decoder. Please refer to
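As a rough sketch of the hardware JPEG path, the `NvJPEGDecoder` class from the jetson_multimedia_api samples can decode a JPEG straight into a dmabuf-backed NvBuffer via `decodeToFd()`. This assumes the Multimedia API headers and libraries from JetPack are installed; error handling is omitted for brevity.

```cpp
// Sketch: hardware JPEG decode into an NvBuffer (dmabuf fd), modeled on
// the jpeg_decode sample from jetson_multimedia_api. Jetson-only code.
#include <fstream>
#include <vector>
#include "NvJpegDecoder.h"

int decode_jpeg_to_fd(const char *path, int &dmabuf_fd)
{
    // Read the whole compressed file into memory.
    std::ifstream in(path, std::ios::binary);
    std::vector<unsigned char> buf((std::istreambuf_iterator<char>(in)),
                                   std::istreambuf_iterator<char>());

    NvJPEGDecoder *dec = NvJPEGDecoder::createJPEGDecoder("jpegdec");
    uint32_t pixfmt = 0, width = 0, height = 0;

    // decodeToFd() writes the decoded frame into an NvBuffer and
    // returns its dmabuf fd (the output format is typically YUV420).
    int ret = dec->decodeToFd(dmabuf_fd, buf.data(), buf.size(),
                              pixfmt, width, height);
    delete dec;
    return ret;  // 0 on success
}
```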
There is no hardware decoder for PNG files, so you would need to use a software decoder and then call Raw2NvBuffer().
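For the PNG case, a minimal sketch of the software path might look like the following: decode with OpenCV, allocate an NvBuffer with NvBufferCreateEx(), and copy the raw pixels in with Raw2NvBuffer(). This assumes the JetPack 4.x nvbuf_utils API; the choice of `NvBufferColorFormat_ARGB32` as the 4-channel destination format is an assumption you should match to your pipeline.

```cpp
// Sketch: software-decode a PNG with OpenCV, then copy the pixels into a
// newly created NvBuffer via Raw2NvBuffer(). Jetson-only; no error handling.
#include <opencv2/opencv.hpp>
#include "nvbuf_utils.h"

int png_to_nvbuffer(const char *path)
{
    cv::Mat img = cv::imread(path, cv::IMREAD_COLOR);  // 3-channel BGR
    cv::Mat bgra;
    cv::cvtColor(img, bgra, cv::COLOR_BGR2BGRA);       // pad to 4 channels

    // Create a destination NvBuffer matching the image size.
    NvBufferCreateParams params = {0};
    params.width = bgra.cols;
    params.height = bgra.rows;
    params.payloadType = NvBufferPayload_SurfArray;
    params.colorFormat = NvBufferColorFormat_ARGB32;   // assumed; adjust to taste
    params.layout = NvBufferLayout_Pitch;

    int dmabuf_fd = -1;
    if (NvBufferCreateEx(&dmabuf_fd, &params) != 0)
        return -1;

    // Copy the raw pixel data into plane 0 of the NvBuffer.
    Raw2NvBuffer(bgra.data, 0, bgra.cols, bgra.rows, dmabuf_fd);
    return dmabuf_fd;
}
```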
All NvBuffer APIs are in
If you use JetPack 5, please use the NvBufSurface APIs instead of the NvBuffer APIs.
Sorry, I have another question: can I composite two dmabuf fds with different pixel formats (e.g. one is YUV420M and the other is I420) using NvBufferComposite()?
The format of each NvBuffer should be identical. You can convert them to the same format by calling NvBufferTransform().
Per our understanding, YUV420M and I420 both map to NvBufferColorFormat_YUV420, so they should be treated as identical.
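Putting the two answers above together, a compositing call might be sketched as follows. This assumes the JetPack 4.x nvbuf_utils API; the destination rectangles and the `composite_two` helper name are illustrative, not from the original thread.

```cpp
// Sketch: if needed, normalize formats with NvBufferTransform(), then
// blend two dmabuf fds onto a destination surface with NvBufferComposite().
#include <string.h>
#include "nvbuf_utils.h"

int composite_two(int src_fd0, int src_fd1, int dst_fd)
{
    // If the sources differed in format, one would first be converted, e.g.:
    //   NvBufferTransformParams tp; memset(&tp, 0, sizeof(tp));
    //   tp.transform_flag = NVBUFFER_TRANSFORM_FILTER;
    //   tp.transform_filter = NvBufferTransform_Filter_Smart;
    //   NvBufferTransform(src_fd1, converted_fd, &tp);

    NvBufferCompositeParams cp;
    memset(&cp, 0, sizeof(cp));
    cp.composite_flag = NVBUFFER_COMPOSITE;
    cp.input_buf_count = 2;
    // dst_comp_rect[i] places each input on the destination surface
    // (fields are top, left, width, height; values here are illustrative).
    cp.dst_comp_rect[0] = (NvBufferRect){0, 0, 1920, 1080};  // full-frame video
    cp.dst_comp_rect[1] = (NvBufferRect){0, 0, 320, 240};    // overlay in corner

    int src_fds[2] = {src_fd0, src_fd1};
    return NvBufferComposite(src_fds, dst_fd, &cp);  // 0 on success
}
```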