I want to overlay an image (from a .png or .jpeg file) on a video frame with GStreamer. My idea is to write a GStreamer plugin that loads the image file using OpenCV and converts it to an NvBuffer (dmabuf_fd). Then I could call NvBufferComposite() to composite the video stream from the GStreamer pipeline (e.g. nvv4l2camerasrc) and the image together.
However, I don't know how to convert a cv::Mat to an NvBuffer. Or could I load the image and decode it to an RGBA NvBuffer directly? Any help would be appreciated.
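For reference, here is a rough, untested sketch of what I imagine the cv::Mat → NvBuffer conversion would look like, based on the nvbuf_utils.h functions (NvBufferCreateEx, NvBufferMemMap, NvBufferMemSyncForDevice). The choice of NvBufferColorFormat_ABGR32 for OpenCV's RGBA data is my own assumption and may need to be swapped. Is this roughly the right direction?

```cpp
// Sketch: load an image with OpenCV and copy it into a pitch-linear RGBA NvBuffer.
// Assumes the Jetson Multimedia API (nvbuf_utils.h) and OpenCV are available.
#include <opencv2/opencv.hpp>
#include <nvbuf_utils.h>
#include <cstring>
#include <cstdint>

int load_image_to_nvbuffer(const char *path, int *out_fd)
{
    cv::Mat bgr = cv::imread(path, cv::IMREAD_COLOR);
    if (bgr.empty())
        return -1;

    cv::Mat rgba;
    cv::cvtColor(bgr, rgba, cv::COLOR_BGR2RGBA);    // 4 bytes per pixel for the RGBA surface

    NvBufferCreateParams params = {0};
    params.width       = rgba.cols;
    params.height      = rgba.rows;
    params.layout      = NvBufferLayout_Pitch;
    params.payloadType = NvBufferPayload_SurfArray;
    params.colorFormat = NvBufferColorFormat_ABGR32; // assumption: adjust if channels come out swapped
    params.nvbuf_tag   = NvBufferTag_NONE;

    int fd = -1;
    if (NvBufferCreateEx(&fd, &params) != 0)
        return -1;

    // Map plane 0 into CPU address space, copy row by row (the NvBuffer pitch is
    // usually wider than the cv::Mat step), then sync the cache for the hardware.
    NvBufferParams buf_params;
    NvBufferGetParams(fd, &buf_params);

    void *virt = nullptr;
    if (NvBufferMemMap(fd, 0, NvBufferMem_Write, &virt) != 0) {
        NvBufferDestroy(fd);
        return -1;
    }
    for (int row = 0; row < rgba.rows; ++row) {
        memcpy(static_cast<uint8_t *>(virt) + row * buf_params.pitch[0],
               rgba.ptr(row),
               rgba.cols * 4);
    }
    NvBufferMemSyncForDevice(fd, 0, &virt);
    NvBufferMemUnMap(fd, 0, &virt);

    *out_fd = fd;
    return 0;
}
```

The resulting dmabuf_fd would then be passed as one of the sources to NvBufferComposite() together with the camera frame's fd. I also saw Raw2NvBuffer() in nvbuf_utils.h; would that be a simpler way to copy the cv::Mat pixels into the buffer instead of mapping it manually?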