I’m working on an application where we need to stream several video cameras at a high framerate.
However, I can’t seem to get a stable stream. Our cameras are 2028x1520, and this test was done while streaming from 2 cameras at 100fps.
Here’s my test gstreamer pipeline for each camera (with a different sensor-id for the other camera):
gst-launch-1.0 nvarguscamerasrc sensor-id=0 \
! 'video/x-raw(memory:NVMM), width=(int)2028, height=(int)1520, format=(string)NV12, framerate=(fraction)100/1' \
! nvvidconv \
! omxh264enc bitrate=8000000 SliceIntraRefreshEnable=true \
! 'video/x-h264,stream-format=byte-stream' ! h264parse ! rtph264pay ! udpsink host=$IP port=$PORT
On the receive side (desktop Intel i7):
gst-launch-1.0 udpsrc port=$PORT \
! 'application/x-rtp, encoding-name=H264, payload=96' \
! rtph264depay \
! h264parse \
! avdec_h264 \
! xvimagesink
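One thing that sometimes helps with RTP over UDP is absorbing packet reordering and late arrivals before depayloading. This is only a sketch of the receive side with a jitter buffer added; `rtpjitterbuffer` is a standard GStreamer element, but the 100 ms latency value is an arbitrary starting point to tune for your link:

```shell
# Receiver with a jitter buffer inserted before the depayloader.
# latency is in milliseconds; larger values trade delay for robustness.
gst-launch-1.0 udpsrc port=$PORT \
    ! 'application/x-rtp, encoding-name=H264, payload=96' \
    ! rtpjitterbuffer latency=100 \
    ! rtph264depay \
    ! h264parse \
    ! avdec_h264 \
    ! xvimagesink
```

If the grey frames come from lost packets rather than encoder issues, this will not fix sustained loss, but it can smooth out bursts.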
And here’s what the results look like:
It becomes better if I lower the bitrate, but then the quality is lower than I’d like.
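For what it’s worth, raw network bandwidth is probably not the limit here. A quick back-of-the-envelope check, assuming the encoder roughly honors the target bitrate:

```shell
# Rough sanity check of total encoded network load:
# N cameras at B bps each (RTP/UDP/IP overhead adds a few percent on top).
CAMERAS=2
BITRATE=8000000                                # bps, as set on omxh264enc
TOTAL_MBPS=$(( CAMERAS * BITRATE / 1000000 ))
echo "${TOTAL_MBPS} Mbps total"                # prints "16 Mbps total"
```

16 Mbps is far below gigabit Ethernet, so on a wired link the corruption is more likely from capture/encode stalls or burst packet loss than from sustained bandwidth.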
Please let me know if there’s something you see with the above that would be an issue, or if there’s something else I could try.
I have just tried with my Xavier running standard R31.1 and launched a pipeline encoding 3 streams. It works fine even with motion (as far as I can shake my Xavier); I can see all 3 streams with very good quality on the host (wired Ethernet for both Xavier and host):
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=(int)2028, height=(int)1520, format=(string)NV12, framerate=(fraction)100/1' ! tee name=t \
! queue ! omxh264enc bitrate=8000000 SliceIntraRefreshEnable=true ! 'video/x-h264,stream-format=byte-stream' ! h264parse ! rtph264pay ! udpsink host=<HOST_IP> port=5000 \
t. ! queue ! omxh264enc bitrate=8000000 SliceIntraRefreshEnable=true ! 'video/x-h264,stream-format=byte-stream' ! h264parse ! rtph264pay ! udpsink host=<HOST_IP> port=5001 \
t. ! queue ! omxh264enc bitrate=8000000 SliceIntraRefreshEnable=true ! 'video/x-h264,stream-format=byte-stream' ! h264parse ! rtph264pay ! udpsink host=<HOST_IP> port=5002
I’m using a TX2 devkit camera module (OV5693) and nvpmodel MAXN, and haven’t boosted clocks for this test.
You may use tegrastats to check whether you see a bottleneck; it may depend on the load from other software.
Can you try running the pipeline above with nothing but stock Ubuntu running on the Xavier?
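To make the tegrastats suggestion concrete, here is a minimal way to capture utilization while the pipelines run (the log path is arbitrary; flag names are from the stock L4T tegrastats tool):

```shell
# Log system utilization once per second while streaming; inspect for
# EMC (memory bus) saturation, maxed CPU cores, or pegged encoder clocks.
sudo tegrastats --interval 1000 | tee /tmp/tegrastats.log
```

Stop it with Ctrl-C once you have reproduced the grey frames, then compare the logged load with an idle baseline.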
I tried running your pipeline; with one receiving window open I get some grey frames, and with all 3 windows open almost all frames are completely grey.
I also ran your pipeline with nvarguscamerasrc replaced by videotestsrc:
gst-launch-1.0 videotestsrc pattern=smpte ! 'video/x-raw, width=(int)2028, height=(int)1520, format=(string)I420, framerate=(fraction)100/1' ! tee name=t \
! queue ! omxh264enc bitrate=8000000 SliceIntraRefreshEnable=true ! 'video/x-h264,stream-format=byte-stream' ! h264parse ! rtph264pay ! udpsink host=${IP} port=5000 \
t. ! queue ! omxh264enc bitrate=8000000 SliceIntraRefreshEnable=true ! 'video/x-h264,stream-format=byte-stream' ! h264parse ! rtph264pay ! udpsink host=${IP} port=5001 \
t. ! queue ! omxh264enc bitrate=8000000 SliceIntraRefreshEnable=true ! 'video/x-h264,stream-format=byte-stream' ! h264parse ! rtph264pay ! udpsink host=${IP} port=5002
With videotestsrc I didn’t seem to have trouble with all 3 videos (tried a few of the noise patterns).
I’m not sure what to make of that; does it mean the camera drivers are an issue? Could you share which camera you’re using?
I will see if I can get a TX2 with its demo camera and test whether that looks or works any differently.
Let me know if you have any thoughts.
I’m using the camera module from the TX2 devkit; it has a Bayer sensor (OV5693), and this module can be plugged into the Xavier (no additional software needed, IIRC). This is what I’ve tried.
If you get a TX2 devkit, just unplug the camera module from it and plug it into the Xavier (be sure both devices are powered off before doing this). Make sure the camera module is well seated.
The fact that it works fine with videotestsrc suggests a problem related to your camera or its driver.
Does it improve if you boost your Jetson (nvpmodel MAXN and jetson_clocks.sh), or if you reduce your framerate?
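For reference, the boost mentioned above amounts to the following. These are the standard L4T commands; the exact location of the clocks script varies by release (older releases ship it as ~/jetson_clocks.sh, newer ones as /usr/bin/jetson_clocks):

```shell
# Select the max-performance power model, then pin clocks to maximum.
sudo nvpmodel -m 0     # MAXN mode on Xavier
sudo jetson_clocks     # or: sudo ~/jetson_clocks.sh on older L4T releases
```

Note that jetson_clocks only holds the boost until reboot, so rerun it for each test session.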
What do these commands report:
sudo v4l2-ctl -d /dev/video0 --all
sudo media-ctl -p
What is your camera? If you know, please also mention which drivers and device-tree changes have been installed. I won’t be able to help much further with this, but from that information other users may be able to advise.
[EDIT: Just a thought, maybe not meaningful, but could you try with nvvidconv between videotestsrc and tee, so that NVMM memory is used and the stream goes through the same path?
gst-launch-1.0 videotestsrc pattern=smpte ! 'video/x-raw, width=(int)2028, height=(int)1520, format=(string)I420, framerate=(fraction)100/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=I420, framerate=100/1' ! tee name=t \
! queue ! omxh264enc bitrate=8000000 SliceIntraRefreshEnable=true ! 'video/x-h264,stream-format=byte-stream' ! h264parse ! rtph264pay ! udpsink host=${IP} port=5000 \
t. ! queue ! omxh264enc bitrate=8000000 SliceIntraRefreshEnable=true ! 'video/x-h264,stream-format=byte-stream' ! h264parse ! rtph264pay ! udpsink host=${IP} port=5001 \
t. ! queue ! omxh264enc bitrate=8000000 SliceIntraRefreshEnable=true ! 'video/x-h264,stream-format=byte-stream' ! h264parse ! rtph264pay ! udpsink host=${IP} port=5002
Furthermore, I’d suggest trying NV12 first and, if that works, then trying I420.]