Is there a way to display two USB camera streams on one HDMI device using GStreamer?

I am using an AGX and a Xavier NX to capture video from two Logitech USB webcams and detect objects in the videos with OpenCV using a YOLOv5 model. I then want to display the two streams on the same HDMI device.

I know I can use nvdrmvideosink to play back one USB stream (I do not work in a desktop environment; I want to show the video directly after the machine boots), but how can I display two USB videos on one HDMI screen? Is there a way to split the screen, for example some parameters to control the display position of each stream?

Thanks.

You would use a compositor. GStreamer provides the software-based plugin compositor, and NVIDIA provides the HW-accelerated plugin nvcompositor. For the latter I'd suggest starting with RGBA format:

stream1 ! nvvidconv ! video/x-raw(memory:NVMM),format=RGBA ! queue ! comp.sink_0  
stream2 ! nvvidconv ! video/x-raw(memory:NVMM),format=RGBA ! queue ! comp.sink_1
nvcompositor name=comp sink_1::xpos=960 ! nvvidconv ! nvdrmvideosink 
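For example, with two USB cameras on /dev/video0 and /dev/video1, a complete command might look like the sketch below. The device nodes, the 640x480@30 caps and the pad geometry are assumptions to adapt to your cameras and monitor; the sink_N::xpos/ypos/width/height pad properties are what control where each stream lands on the screen. Note that nvdrmvideosink has to run outside of a desktop session, which matches your boot-to-display use case.

gst-launch-1.0 \
  v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=30/1 ! \
    nvvidconv ! 'video/x-raw(memory:NVMM),format=RGBA' ! queue ! comp.sink_0 \
  v4l2src device=/dev/video1 ! video/x-raw,width=640,height=480,framerate=30/1 ! \
    nvvidconv ! 'video/x-raw(memory:NVMM),format=RGBA' ! queue ! comp.sink_1 \
  nvcompositor name=comp \
    sink_0::xpos=0 sink_0::ypos=0 sink_0::width=640 sink_0::height=480 \
    sink_1::xpos=640 sink_1::ypos=0 sink_1::width=640 sink_1::height=480 ! \
  nvvidconv ! nvdrmvideosink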

Searching this forum for nvcompositor, you will find several examples.

That is not easy for me, because the two frame streams are handled by two separate processes in my Python program.

I'm afraid that what you want will not be easy because of that. You may consider redesigning it as a multi-threaded application so that both streams share the same process memory space.

If you can't do this, I see two possible workarounds without a GUI.

  1. If your monitor supports PIP mode and has both an HDMI and a DP input, you may split the monitor into two displays (one for HDMI and one for DP; the Xavier NX devkit has both connectors, so use two cables), then use the nvdrmvideosink property conn-id to send each stream to the wanted display. I am not sure the monitor will properly keep that PIP/input configuration upon wake-up, though.

  2. The second workaround would be using shmsink/shmsrc for interprocess communication; a test sketch follows the caveats below. You would do something like:

Process1 → queue → shmsink socket-path=./app1
Process2 → queue → shmsink socket-path=./app2

Or:
Process1 → identity drop-allocation=1 → shmsink socket-path=./app1
Process2 → identity drop-allocation=1 → shmsink socket-path=./app2

Then read both inputs and compose:

gst-launch-1.0 \
  shmsrc socket-path=./app1 do-timestamp=1 ! nvvidconv ! 'video/x-raw(memory:NVMM),format=RGBA' ! queue ! comp.sink_0 \
  shmsrc socket-path=./app2 do-timestamp=1 ! nvvidconv ! 'video/x-raw(memory:NVMM),format=RGBA' ! queue ! comp.sink_1 \
  nvcompositor name=comp ! 'video/x-raw(memory:NVMM),format=BGRA' ! nvvidconv ! nvdrmvideosink

Be aware that this may cost some CPU usage, and you may have to manage the deletion of the sockets (the producer would only clean up if the consumer no longer uses it; it may be better to create your named sockets beforehand, though I have not tested that).
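For a quick end-to-end test of the second workaround before wiring in your Python processes, the pair of commands below may work. It is only a sketch under assumptions: the /tmp/cam1 and /tmp/cam2 socket paths, the 640x480@30 I420 caps and the shm-size value are placeholders, and because shmsrc does not carry caps across the socket, the receiver must state exactly the format the sender produces.

Sender side (one per process, here simulated with videotestsrc; repeat with socket-path=/tmp/cam2 for the second process):

gst-launch-1.0 videotestsrc is-live=true ! \
  video/x-raw,format=I420,width=640,height=480,framerate=30/1 ! \
  shmsink socket-path=/tmp/cam1 shm-size=10000000 wait-for-connection=false sync=false

Receiver/compositor side:

gst-launch-1.0 \
  shmsrc socket-path=/tmp/cam1 do-timestamp=1 ! \
    video/x-raw,format=I420,width=640,height=480,framerate=30/1 ! \
    nvvidconv ! 'video/x-raw(memory:NVMM),format=RGBA' ! queue ! comp.sink_0 \
  shmsrc socket-path=/tmp/cam2 do-timestamp=1 ! \
    video/x-raw,format=I420,width=640,height=480,framerate=30/1 ! \
    nvvidconv ! 'video/x-raw(memory:NVMM),format=RGBA' ! queue ! comp.sink_1 \
  nvcompositor name=comp sink_1::xpos=640 ! nvvidconv ! nvdrmvideosink

In your real setup, each Python process would replace videotestsrc with its own detection output, for example through a cv2.VideoWriter opened with CAP_GSTREAMER and a pipeline string ending in shmsink (this assumes your OpenCV build has GStreamer support).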

Thanks, I will give it a try later.
Why is there no sink on the AGX, such as nvoverlaysink, that would fit my request so well?
