• Hardware Platform (Jetson / GPU): Jetson NX
• DeepStream Version: 6.1
• JetPack Version (valid for Jetson only): JetPack 5.0.2 GA [L4T 35.1.0]
Hey all, I am trying to build a fairly simple pipeline with DeepStream, no deep learning involved. The goal is to have multiple cameras streaming and to send the combined output over the network to another machine.
I chose DeepStream mainly for the convenience of nvstreammux, which batches the camera streams, and nvmultistreamtiler, which tiles them into a single buffer that I can then H.264-encode and send over UDP to the remote machine.
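For reference, the multi-source version I am building towards would look roughly like the sketch below. It is only a sketch: the second videotestsrc (standing in for my real cameras), the 1x2 tiler layout and the 640x240 tile size are placeholders, and I have not gotten this far yet.

# Hypothetical target pipeline; second source, tiler layout and tile size are placeholders, not tested
gst-launch-1.0 videotestsrc pattern=0 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=320,height=240' ! queue ! m.sink_0 videotestsrc pattern=18 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=320,height=240' ! queue ! m.sink_1 nvstreammux name=m width=320 height=240 batch-size=2 live-source=1 ! nvmultistreamtiler rows=1 columns=2 width=640 height=240 ! nvv4l2h264enc bitrate=8000000 insert-sps-pps=true ! rtph264pay mtu=1400 ! udpsink sync=false host=192.168.1.51 port=5004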
At this stage I just want a pipeline that works for a single stream; I will add more sources to the muxer later. However, I am running into the following issue:
When I run this pipeline, it works as expected.
gst-launch-1.0 videotestsrc pattern=0 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=320,height=240' ! queue ! m.sink_0 nvstreammux name=m width=320 height=240 batch-size=1 live-source=1 ! nvv4l2h264enc bitrate=8000000 insert-sps-pps=true ! rtph264pay mtu=1400 ! udpsink sync=false host=192.168.1.51 port=5004
But as soon as I add nvmultistreamtiler, only one frame gets sent to the remote and then the video freezes.
gst-launch-1.0 videotestsrc pattern=0 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=320,height=240' ! queue ! m.sink_0 nvstreammux name=m width=320 height=240 batch-size=2 live-source=1 ! nvmultistreamtiler rows=1 columns=1 width=320 height=240 ! nvv4l2h264enc bitrate=8000000 insert-sps-pps=true ! rtph264pay mtu=1400 ! udpsink sync=false host=192.168.1.51 port=5004
I can’t seem to understand why. Please help me out here.
This is the pipeline I use on the remote to view the stream:
gst-launch-1.0.exe udpsrc port=5004 ! 'application/x-rtp, payload=127' ! rtph264depay ! h264parse ! decodebin ! videoconvert ! autovideosink sync=false