I have a custom carrier board we developed for the X2 module, which provides 4x 1080p60 YUV 4:2:2 (UYVY) input feeds over MIPI-CSI to the X2 module. Our application requires GStreamer 1.11.0 to interface with the hardware encoder and nvvidconv. We made a custom build of GStreamer 1.11.0 following the RidgeRun wiki:
We took the ov5693 driver and simply edited it to remove all register reads/writes to the physical ov5693 part, and added UYVY support. So we essentially have a dummy YUV 4:2:2 v4l2src delivering 1080p60 frames. I am not using JetPack; I am using Linux for Tegra 28.2.1 with no GUI.
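For reference, the formats the modified driver advertises can be checked from userspace; a quick sketch with v4l2-ctl (the device node /dev/video1 is an example):

```shell
# List the pixel formats, frame sizes, and frame intervals the dummy
# driver advertises; UYVY at 1920x1080 with a 60 fps interval should
# appear in this output (device node is an assumption).
v4l2-ctl --device=/dev/video1 --list-formats-ext
```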
A single stream using this gst-launch pipeline works fine:
~/gst_1.11.0/out/bin/gst-launch-1.0 v4l2src device=/dev/video1 ! "video/x-raw, width=1920, height=1080, format=(string)UYVY, framerate=(fraction)60/1" ! nvvidconv ! 'video/x-raw(memory:NVMM), width=1920, height=1080, format=I420, framerate=60/1' ! omxh264enc bitrate=5000000 ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! mpegtsmux ! udpsink host=224.0.0.2 port=5068
There are no visual anomalies and the input holds at 60 fps.
Running 2 inputs using the following pipeline results in 60 fps for both channels at the beginning, but after about 20 seconds both drop to 40 fps:
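To measure the delivered rate independently of the encoder path, I can swap the tail of the pipeline for an fpsdisplaysink; a sketch (assuming fpsdisplaysink and fakesink are present in my 1.11.0 build):

```shell
# Replace the encode/mux/udpsink tail with fpsdisplaysink wrapping a
# fakesink, so the measured fps reflects capture + nvvidconv only.
# With -v the current/average fps is printed to the console.
~/gst_1.11.0/out/bin/gst-launch-1.0 -v v4l2src device=/dev/video1 ! \
  "video/x-raw, width=1920, height=1080, format=(string)UYVY, framerate=(fraction)60/1" ! \
  nvvidconv ! 'video/x-raw(memory:NVMM), width=1920, height=1080, format=I420' ! \
  fpsdisplaysink text-overlay=false video-sink=fakesink
```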
~/gst_1.11.0/out/bin/gst-launch-1.0 \
v4l2src device=/dev/video1 io-mode=2 do-timestamp=true ! "video/x-raw, width=1920, height=1080, format=(string)UYVY, framerate=(fraction)60/1" ! queue ! nvvidconv output-buffers=4 interpolation-method=5 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, format=I420, framerate=60/1' ! queue ! omxh264enc bitrate=5000000 ! 'video/x-h264, stream-format=(string)byte-stream' ! queue ! h264parse ! queue ! mpegtsmux ! queue ! udpsink host=224.0.0.2 port=5068 \
v4l2src device=/dev/video2 io-mode=2 do-timestamp=true ! "video/x-raw, width=1920, height=1080, format=(string)UYVY, framerate=(fraction)60/1" ! queue ! nvvidconv output-buffers=4 interpolation-method=5 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, format=I420, framerate=60/1' ! queue ! omxh264enc bitrate=5000000 ! 'video/x-h264, stream-format=(string)byte-stream' ! queue ! h264parse ! queue ! mpegtsmux ! queue ! udpsink host=224.0.0.2 port=6068
Adding a 3rd stream results in 20 fps.
yavta and v4l2-ctl both report that our input sensors are running at 60 fps, even when I open 3 instances.
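The v4l2-ctl check I run is roughly the following (device node and frame count are examples); it streams straight from V4L2 with no GStreamer in the loop:

```shell
# Capture 600 frames via mmap directly from the V4L2 device;
# v4l2-ctl prints the measured fps every few seconds while streaming.
v4l2-ctl --device=/dev/video1 \
  --set-fmt-video=width=1920,height=1080,pixelformat=UYVY \
  --stream-mmap --stream-count=600
```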
I added more output-buffers to the nvvidconv element, and it then started streaming at 60 fps, but the video had a bunch of corruption issues: a lot of tearing, and occasionally the video would actually skip back a few frames. It is almost as if the pool of video frames is a ring buffer falling behind, but I don't understand the inner workings.
I changed my pipeline to use videoconvert instead of nvvidconv, but even with one stream it sits at 30 fps. I do not know why, since the same pipeline fed by a videotestsrc at 60 fps streams at 60 fps.
Lastly, I set up a videotestsrc at 1920x1080@60 UYVY with nvvidconv and ran 4 of them. As long as I kept the default pattern on the test source, everything streamed perfectly. Here is that pipeline:
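For scale, the raw memory traffic involved is easy to estimate; a quick sketch of the per-stream and aggregate input rates (pure arithmetic, no assumptions about the hardware):

```shell
# UYVY is 2 bytes per pixel, so each 1080p60 feed carries:
BYTES_PER_FRAME=$((1920 * 1080 * 2))      # 4,147,200 bytes per frame
BYTES_PER_SEC=$((BYTES_PER_FRAME * 60))   # ~249 MB/s per stream
echo "per-stream: ${BYTES_PER_SEC} B/s"
echo "4 streams:  $((BYTES_PER_SEC * 4)) B/s"   # ~995 MB/s aggregate
```

Every extra copy (e.g. a CPU-side videoconvert) multiplies that traffic, which may be why the software path caps out well below 60 fps.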
~/gst_1.11.0/out/bin/gst-launch-1.0 \
videotestsrc is-live=true do-timestamp=true pattern=0 ! "video/x-raw, width=1920, height=1080, format=(string)UYVY, framerate=(fraction)60/1" ! queue ! nvvidconv output-buffers=60 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, format=I420, framerate=60/1' ! queue ! omxh264enc bit-packetization=true bitrate=5000000 ! 'video/x-h264, stream-format=(string)byte-stream' ! queue ! h264parse ! mpegtsmux ! udpsink host=224.0.0.2 port=5068 \
videotestsrc is-live=true do-timestamp=true pattern=0 ! "video/x-raw, width=1920, height=1080, format=(string)UYVY, framerate=(fraction)60/1" ! queue ! nvvidconv output-buffers=60 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, format=I420, framerate=60/1' ! queue ! omxh264enc bit-packetization=true bitrate=5000000 ! 'video/x-h264, stream-format=(string)byte-stream' ! queue ! h264parse ! mpegtsmux ! udpsink host=224.0.0.2 port=6068 \
videotestsrc is-live=true do-timestamp=true pattern=0 ! "video/x-raw, width=1920, height=1080, format=(string)UYVY, framerate=(fraction)60/1" ! queue ! nvvidconv output-buffers=60 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, format=I420, framerate=60/1' ! queue ! omxh264enc bit-packetization=true bitrate=5000000 ! 'video/x-h264, stream-format=(string)byte-stream' ! queue ! h264parse ! mpegtsmux ! udpsink host=224.0.0.2 port=7068 \
videotestsrc is-live=true do-timestamp=true pattern=0 ! "video/x-raw, width=1920, height=1080, format=(string)UYVY, framerate=(fraction)60/1" ! queue ! nvvidconv output-buffers=60 ! 'video/x-raw(memory:NVMM), width=1920, height=1080, format=I420, framerate=60/1' ! queue ! omxh264enc bit-packetization=true bitrate=5000000 ! 'video/x-h264, stream-format=(string)byte-stream' ! queue ! h264parse ! mpegtsmux ! udpsink host=224.0.0.2 port=8068
If I change the pattern I get performance issues, but I believe that is because the CPU generating those images falls behind.
Any ideas? Since videotestsrc works, the pipeline itself seems sound. Is there some issue with my setup, or with v4l2src interfacing with everything else? I believe it is using io-mode mmap (io-mode=2).
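For what it's worth, the io-mode enum values can be confirmed from gst-inspect on my custom build; a sketch (binary path is from my install prefix):

```shell
# v4l2src's io-mode property lists the enum mapping; value 2 should
# correspond to "mmap" (0 = auto, 1 = rw, 3 = userptr, 4 = dmabuf).
~/gst_1.11.0/out/bin/gst-inspect-1.0 v4l2src | grep -A 8 "io-mode"
```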
Thanks in advance!