Streaming B01 Dual Camera Image over Network to VLC

Hi everybody,

I am trying to capture a stereo image on a Jetson Nano 4GB with two CSI Raspberry Pi NoIR V2 cameras and to stream the side-by-side image to a network client.
For capturing the images and creating the side-by-side view, I used the JetsonHacks demo, which works very well.
No matter what I try, I cannot seem to get the stream to work. In the code below we tried to use GStreamer with an RTSP server. The issue here is: I can launch the script and access the stream via VLC from e.g. a mobile device, but the picture received is garbage, just a mess of pixels.

Please find the script attached. Can you help me with this issue? Thank you.

Kind Regards

You may eliminate the network factor first. Please try the command on Jetson Nano:

$ gst-launch-1.0 uridecodebin uri='rtsp://' ! nvoverlaysink

Not sure if the port is 8554. Please change the RTSP URI accordingly.
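For instance, assuming the server uses test-launch's default port 8554 and a mount point of /test (both assumptions, adjust to your setup), the local playback test would look like:

```
$ gst-launch-1.0 uridecodebin uri='rtsp://127.0.0.1:8554/test' ! nvoverlaysink
```

If this local playback already shows the same pixel mess, the network is ruled out and the problem is in the encoding pipeline.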

You can also try the hardware encoder nvv4l2h264enc. And you can try UDP streaming like:
Gstreamer TCPserversink 2-3 seconds latency - #5 by DaneLLL
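A minimal UDP sketch with the hardware encoder, assuming the client PC is reachable at 192.168.0.2 and port 5000 is free (both placeholders):

```
# Sender on the Jetson Nano:
$ gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=1280, height=720, framerate=30/1' ! nvv4l2h264enc insert-sps-pps=1 ! h264parse ! rtph264pay pt=96 ! udpsink host=192.168.0.2 port=5000

# Receiver on the client:
$ gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp, media=video, encoding-name=H264, payload=96' ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
```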

Hi DaneLLL,
thanks for the quick response.
Executing that line, the displayed result looks identical to what VLC receives.
Does this give you additional clues?

I have not yet figured out how to correctly include the hardware encoder in the launch string.

Have a nice New Year's Eve.

You may run test-launch to launch an RTSP server:

$ ./test-launch "nvarguscamerasrc ! nvv4l2h264enc ! h264parse ! rtph264pay name=pay0 pt=96"
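test-launch is not shipped as a ready-made binary; it is built from the gst-rtsp-server examples. One way to get it on the Nano (assuming the gst-rtsp-server sources checked out match the system GStreamer version):

```
$ sudo apt-get install libgstrtspserver-1.0-dev
$ # test-launch.c comes from gst-rtsp-server/examples in the GStreamer sources
$ gcc test-launch.c -o test-launch $(pkg-config --cflags --libs gstreamer-1.0 gstreamer-rtsp-server-1.0)
```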

And check if the issue is present. Also confirm that the camera frames are good by running this command to show a preview:

$ gst-launch-1.0 nvarguscamerasrc ! nvoverlaysink

Hi DaneLLL,

your pipelines work :)
Checking the frames shows them to be intact. I also checked the side-by-side image in a cv2 display window, and it seems to be intact as well.

The test-launch also works fine. The system then streams with approximately 0.5 s of delay. I guess there is an issue with the transfer from the appsrc to the pipeline. Any idea how to fix that?
Is there a method to create the side-by-side image using nvcompositor in CUDA memory, without Python code? If I could capture both cameras in one pipeline and compose the side-by-side there, I could skip OpenCV altogether and just launch the RTSP server from the script.
I also ran across this post:

This is more or less exactly the application I want to create, except that the sink in my case is an Android mobile device.
Unfortunately, the solution pipeline from that post does not work either. When I run the receiver and the sender, the receiver stays black and the sender pipeline closes right after the first frame, stating “nvbuffer_composite Failed”.

In parallel, I checked the UDP examples. The test program was able to stream on the local host with very low latency, as expected. However, I was not able to incorporate the method into my code and make it stream to VLC.

Since my main sink will be an Android device, running the pipeline there has not been an option so far (I have no clue about the SDK).
Streaming over UDP would of course be an option as well, if I can implement a sink on the Android phone.
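For playing a raw RTP/UDP stream in VLC (including VLC for Android), a common approach is to describe the stream in a small SDP file and open that file in VLC. A minimal sketch, assuming H.264 RTP arriving on UDP port 5000 with payload type 96 (matching rtph264pay pt=96 on the sender; addresses are placeholders):

```shell
# Write a minimal SDP file describing an incoming H.264 RTP stream.
# Port 5000 and payload type 96 must match the sender's udpsink/rtph264pay;
# c=0.0.0.0 means "listen on any local address".
cat > stream.sdp <<'EOF'
v=0
o=- 0 0 IN IP4 0.0.0.0
s=Jetson stereo stream
c=IN IP4 0.0.0.0
t=0 0
m=video 5000 RTP/AVP 96
a=rtpmap:96 H264/90000
EOF
```

Opening stream.sdp in VLC should then make it listen for the RTP packets.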

Thanks again.
Kind Regards

Hi DaneLLL,

Browsing the forums led me to two working solutions.

Solution one:
Compose the two video feeds with nvcompositor and send the result via tcpserversink. The resulting pipeline, which can be received with VLC with almost no latency, is this one:

$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 sensor-mode=4 ! 'video/x-raw(memory:NVMM), width=1280, height=(int)720, format=(string)NV12, framerate=(fraction)60/1' ! nvvidconv flip-method=2 ! 'video/x-raw(memory:NVMM), format=RGBA, width=640, height=360' ! comp. nvarguscamerasrc sensor-id=1 sensor-mode=4 ! 'video/x-raw(memory:NVMM), width=1280, height=(int)720, format=(string)NV12, framerate=(fraction)60/1' ! nvvidconv flip-method=2 ! 'video/x-raw(memory:NVMM), format=RGBA, width=640, height=360' ! comp. nvcompositor name=comp sink_0::xpos=0 sink_0::ypos=0 sink_0::width=640 sink_0::height=360 sink_1::xpos=640 sink_1::ypos=0 sink_1::width=640 sink_1::height=360 ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12, framerate=(fraction)30/1' ! nvjpegenc ! tcpserversink port=5000 sync=0 async=false
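On the client side, VLC needs to be pointed at the TCP port and told that the payload is M-JPEG. Assuming the Jetson is reachable at 192.168.0.10 (a placeholder for your address), something like the following is commonly reported to work:

```
$ vlc tcp://192.168.0.10:5000 --demux=mjpeg
```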

Solution two:
I managed to fix the Python code. It turned out to be a very unfortunate oversight in the launch-pipeline string: the resolution there needs to match the format of the OpenCV output and… voilà, the RTSP stream works as well as the TCP one.
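For illustration, the kind of mismatch involved: the caps of the appsrc in the launch string must describe exactly the frames OpenCV pushes. A sketch with assumed numbers (two 640×360 views side by side, BGR, 30 fps; not necessarily the exact values from my script):

```
appsrc ! video/x-raw, format=BGR, width=1280, height=360, framerate=30/1 ! videoconvert ! nvvidconv ! video/x-raw(memory:NVMM), format=NV12 ! nvv4l2h264enc ! h264parse ! rtph264pay name=pay0 pt=96
```

If the width/height here disagree with the actual cv2 frame size, the payloader happily packetizes garbage, which matches the pixel mess I saw initially.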

VLC can play both versions very well, with the buffer time being the limiting factor when it comes to lag.

Thanks again for your support.
I hope this helps the community.