I’m working on an application where we need to stream several video cameras at high framerate.
However, I seem to be unable to get a stable stream. Our cameras are 2028x1520, and this test was done while streaming from 2 cameras at 100fps.
Here’s my test gstreamer pipeline for each camera (with a different sensor-id for the other camera):
gst-launch-1.0 nvarguscamerasrc sensor-id=0 \
  ! 'video/x-raw(memory:NVMM), width=(int)2028, height=(int)1520, format=(string)NV12, framerate=(fraction)100/1' \
  ! nvvidconv \
  ! omxh264enc bitrate=8000000 SliceIntraRefreshEnable=true \
  ! 'video/x-h264, stream-format=byte-stream' \
  ! h264parse \
  ! rtph264pay \
  ! udpsink host=$IP port=$PORT
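For reference, here's a variant of the sender I could also try, since the omx encoders are deprecated on newer Jetson releases. This is a sketch assuming a JetPack version that ships the V4L2 encoder plugins; `insert-sps-pps` and `maxperf-enable` are `nvv4l2h264enc` properties, and `config-interval=1` on the payloader makes SPS/PPS repeat so a receiver can resync after packet loss:

```shell
# Hypothetical alternative sender: nvv4l2h264enc instead of the
# deprecated omxh264enc (assumes V4L2 encoder plugins are installed).
gst-launch-1.0 nvarguscamerasrc sensor-id=0 \
  ! 'video/x-raw(memory:NVMM), width=(int)2028, height=(int)1520, format=(string)NV12, framerate=(fraction)100/1' \
  ! nvv4l2h264enc bitrate=8000000 insert-sps-pps=true maxperf-enable=true \
  ! 'video/x-h264, stream-format=byte-stream' \
  ! h264parse \
  ! rtph264pay config-interval=1 pt=96 \
  ! udpsink host=$IP port=$PORT
```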
On the receive side (desktop Intel i7):
gst-launch-1.0 udpsrc port=$PORT \
  ! 'application/x-rtp, encoding-name=H264, payload=96' \
  ! rtph264depay \
  ! h264parse \
  ! avdec_h264 \
  ! xvimagesink
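One thing I could also try on the receive side is an `rtpjitterbuffer` to absorb packet reordering and arrival jitter, which at 100fps over UDP seems plausible as a cause of instability. A sketch (the `latency=50` value, in milliseconds, is just a guess to tune):

```shell
# Hypothetical receiver with a jitter buffer before depayloading.
gst-launch-1.0 udpsrc port=$PORT \
  ! 'application/x-rtp, media=(string)video, encoding-name=(string)H264, payload=96' \
  ! rtpjitterbuffer latency=50 \
  ! rtph264depay \
  ! h264parse \
  ! avdec_h264 \
  ! xvimagesink
```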
And here’s what the results look like:
It gets better if I lower the bitrate, but then the quality is lower than I'd like.
Please let me know if there’s something you see with the above that would be an issue, or if there’s something else I could try.