The ACCELERATED GSTREAMER USER GUIDE for 28.2.1 states:
Note: nvcompositor supports video decode (gst-omx) with the overlay render pipeline for gst-1.8.3.
That statement is a bit confusing to me; does it mean nvcompositor must be used with nvoverlaysink?
If not, I’m interested in how to use nvcompositor with a live camera source, omxh264enc, and a UDP stream.
We only verify/support
omxh264dec/omxh265dec ! nvcompositor ! nvoverlaysink
nvcompositor does not work unless its inputs are linked from omxh264dec/omxh265dec.
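For reference, a full command for the verified path might look like the sketch below. The file names, caps, and the sink-pad position properties (sink_N::xpos/ypos) are illustrative assumptions, not taken from the guide:

```shell
# Untested sketch of the verified pipeline: two H.264 files decoded
# with omxh264dec, composited side by side, rendered via overlay.
# File names and pad positions are placeholders.
gst-launch-1.0 \
  filesrc location=a.mp4 ! qtdemux ! h264parse ! omxh264dec ! comp.sink_0 \
  filesrc location=b.mp4 ! qtdemux ! h264parse ! omxh264dec ! comp.sink_1 \
  nvcompositor name=comp \
    sink_0::xpos=0 sink_0::ypos=0 sink_1::xpos=640 sink_1::ypos=0 ! \
  nvoverlaysink
```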
A related post: https://devtalk.nvidia.com/default/topic/1037041/
That is rather limiting; it would be great to use the compositor to composite two camera feeds, or possibly imagery from an appsrc. Are there any plans to extend this functionality in the future?
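To be concrete, something like the following sketch is what I would like to build (untested; element names, caps, and the host/port are my assumptions, and per the reply above this camera-fed arrangement is not a supported path):

```shell
# Hypothetical pipeline: composite two live camera feeds, encode
# with omxh264enc, and stream over RTP/UDP. Sensor IDs, resolutions,
# and the destination host/port are placeholders.
gst-launch-1.0 \
  nvcamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=1280, height=720' ! comp.sink_0 \
  nvcamerasrc sensor-id=1 ! 'video/x-raw(memory:NVMM), width=1280, height=720' ! comp.sink_1 \
  nvcompositor name=comp ! omxh264enc ! h264parse ! rtph264pay ! \
  udpsink host=192.168.1.10 port=5000
```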
I suggest you check tegra_multimedia_api. Please refer to