Connect Jetson Nano to a laptop camera via RTP

Hello,

For a while I’ve been trying to connect the Jetson Nano to a laptop camera, but without success. My goal is a detection program that can connect to an external camera, something like this: Object detection using cellphone camera (as IP camera), a Jetson Nano and python code - YouTube, but the video doesn’t explain much and its code doesn’t work for me. Here is another reference, AI on the Jetson Nano LESSON 62: Make a Streaming IP Camera from a Raspberry Pi Zero W - YouTube, but I don’t use a Raspberry Pi Zero W.

My program is written in Python and uses OpenCV for the output. It works with a USB webcam.

I did some research and tried a few changes, using the VideoLAN documentation and Webcam streaming through VLC with YUY2 compatibility (because I had received other decoding errors before). The closest I got was creating a server on my laptop (x.x.x.x is the laptop IP) with:

cvlc -vvv v4l2:///dev/video0:chroma=h264 --v4l2-width 640 --v4l2-height 480 --sout '#transcode{venc=x264{keyint=30},vcodec=h264,vb=10000,scale=0.5,fps=24,width=640,height=480,threads=4,high-priority=TRUE}:rtp{mux=ts,sdp=rtsp://x.x.x.x:8554/cam.sdp}'

Then, on the Jetson, to test these settings:

video-viewer rtsp://x.x.x.x:8554/cam.sdp

But I get lots of “late buffer for mux input” warnings, laggy output when connecting from another laptop, and “video-viewer failed to capture video frame” errors.

I’m new to this, but I really need it for a project of mine.

How should I set up the connection and use it in my code? Should I instead run the server on the Jetson Nano and connect to it from my laptop?

Hi,
We would suggest using GStreamer. Please check if you can run this command successfully and see a video preview:

$ gst-launch-1.0 uridecodebin uri='rtsp://x.x.x.x:8554/cam.sdp' ! nvoverlaysink

OK, I got an output, thank you so much! But the stream is laggy, with a big delay, and some frames appear distorted. On the laptop, the streaming terminal prints a warning after each “fps filter debug: Resetting timestamps” saying “main mux warning: late buffer for mux input” (followed by a number).

How can I optimize my stream using this codec? Or should I change it?

The laptop I’m streaming from is an HP with a 4th-generation i3 processor. Do I need a more powerful laptop?

Hi,
It looks like you have a USB camera connected to the laptop. Is the laptop running Linux? If so, you can install GStreamer and try to launch an RTSP server through test-launch. You may refer to the steps in the Jetson Nano FAQ:
Q: Is there any example of running RTSP streaming?

The nvv4l2h264enc plugin works on Jetson platforms. On the laptop, you can use x264enc like this:

$ ./test-launch "videotestsrc ! x264enc ! h264parse ! rtph264pay name=pay0 pt=96"

And then try to receive/decode the stream on Jetson Nano.

If your laptop runs Windows, we don’t have much experience setting up an RTSP server on Windows; other users would need to share their experience.
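If the program on the laptop is already Python/OpenCV, one alternative sketch (not from the FAQ) is to push the camera frames over plain RTP/UDP with cv2.VideoWriter and a GStreamer pipeline, which avoids running an RTSP server entirely. This assumes an OpenCV build with GStreamer enabled; the receiver address 192.168.1.42 and port 5000 are placeholders:

```python
def make_udp_sender_pipeline(host, port, fps=30):
    """Build a GStreamer pipeline string for cv2.VideoWriter that converts
    frames to I420, H.264-encodes them, and sends RTP packets over UDP."""
    return (
        "appsrc ! videoconvert ! video/x-raw,format=I420 "
        f"! x264enc speed-preset=ultrafast tune=zerolatency key-int-max={fps} "
        "! rtph264pay config-interval=1 pt=96 "
        f"! udpsink host={host} port={port}"
    )

if __name__ == "__main__":
    import cv2  # requires an OpenCV build with GStreamer support

    cap = cv2.VideoCapture(0)  # laptop webcam
    out = cv2.VideoWriter(
        make_udp_sender_pipeline("192.168.1.42", 5000),  # placeholder receiver
        cv2.CAP_GSTREAMER, 0, 30.0, (640, 480),
    )
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        out.write(cv2.resize(frame, (640, 480)))
```

On the receiver side such a stream would be picked up with udpsrc and rtph264depay instead of rtspsrc; unlike RTSP, the caps have to be known in advance on the receiver.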

Hello,

Yes, I run Linux on the laptop. I followed the steps on Jetson Nano FAQ and when I run:

$ ./test-launch "videotestsrc ! x264enc ! h264parse ! rtph264pay name=pay0 pt=96"

it outputs: stream ready at rtsp://127.0.0.1:8554/test

Then on my Jetson:

$ gst-launch-1.0 uridecodebin uri='rtsp://<IP_HOST>:8554/test' ! nvoverlaysink

I get:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://<IP_HOST>:8554/test
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
ERROR: from element /GstPipeline:pipeline0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source: Unhandled error
Additional debug info:
gstrtspsrc.c(6161): gst_rtspsrc_send (): /GstPipeline:pipeline0/GstURIDecodeBin:uridecodebin0/GstRTSPSrc:source:
Service Unavailable (503)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

What am I doing wrong?

Note: IP_HOST is my laptop IP.

Hi,
Please check if the URI is valid:

gst-launch-1.0 rtspsrc location='rtsp://<IP_HOST>:8554/test' ! fakesink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://<IP_HOST>:8554/test
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Unhandled error
Additional debug info:
gstrtspsrc.c(6161): gst_rtspsrc_send (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Bad Request (400)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

And on laptop:

** (test-launch:4323): CRITICAL **: 09:59:21.267: could not parse launch syntax (?videotestsrc): empty pipeline not allowed

** (test-launch:4323): CRITICAL **: 09:59:21.267: could not create element

but the stream is still running.

Continuing the discussion:

I followed the Jetson Nano FAQ and found this: Running gstreamer on ubuntu sending video through RTSP is too slow - Stack Overflow. I followed the instructions there and got an output screen. Then I checked the GStreamer device monitor with the command-line testing tool:

$ gst-device-monitor-1.0 

...
name  : Integrated Camera: Integrated C
	class : Video/Source
	caps  : video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)10/1;
	        video/x-raw, format=(string)YUY2, width=(int)960, height=(int)540, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1;
	        video/x-raw, format=(string)YUY2, width=(int)848, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)20/1;
	        video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
	        video/x-raw, format=(string)YUY2, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
	        video/x-raw, format=(string)YUY2, width=(int)424, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
	        video/x-raw, format=(string)YUY2, width=(int)352, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
	        video/x-raw, format=(string)YUY2, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
	        video/x-raw, format=(string)YUY2, width=(int)320, height=(int)180, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
	        image/jpeg, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
	        image/jpeg, width=(int)960, height=(int)540, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
	        image/jpeg, width=(int)848, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
	        image/jpeg, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
	        image/jpeg, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
	        image/jpeg, width=(int)424, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
	        image/jpeg, width=(int)352, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
	        image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
	        image/jpeg, width=(int)320, height=(int)180, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
	properties:
		udev-probed = true
		device.bus_path = pci-0000:05:00.3-usb-0:3:1.0
		sysfs.path = /sys/devices/pci0000:00/0000:00:08.1/0000:05:00.3/usb1/1-3/1-3:1.0/video4linux/video0
		device.bus = usb
		device.subsystem = video4linux
		device.vendor.id = 5986
		device.vendor.name = "SunplusIT\\x20Inc"
		device.product.id = 212b
		device.product.name = "Integrated\ Camera:\ Integrated\ C"
		device.serial = SunplusIT_Inc_Integrated_Camera
		device.capabilities = :capture:
		device.api = v4l2
		device.path = /dev/video0
		v4l2.device.driver = uvcvideo
		v4l2.device.card = "Integrated\ Camera:\ Integrated\ C"
		v4l2.device.bus_info = usb-0000:05:00.3-3
		v4l2.device.version = 330518 (0x00050b16)
		v4l2.device.capabilities = 2225078273 (0x84a00001)
		v4l2.device.device_caps = 69206017 (0x04200001)
	gst-launch-1.0 v4l2src ! ...

Then I did as @DaneLLL said and added the preferred parameters:

./test-launch '( v4l2src device=/dev/video0 ! video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1 ! videoconvert ! queue ! x264enc speed-preset=ultrafast tune=zerolatency byte-stream=true threads=1 key-int-max=15 intra-refresh=true ! rtph264pay name=pay0 pt=96 )'

This works pretty well when I connect from my phone with VLC, but latency is somewhat high: the video is clear but slow. The encoder also returns an error:

x264 [error]: baseline profile doesn't support 4:2:2

I’m thinking the encoder doesn’t support YUY2.
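That matches the error text: YUY2 is a packed 4:2:2 format, while the H.264 baseline profile only accepts 4:2:0 input, so the chroma has to be downsampled before encoding. A quick arithmetic sketch of the two raw frame sizes at 640x480:

```python
def frame_bytes_yuy2(width, height):
    # YUY2 is packed YUV 4:2:2: 2 bytes per pixel on average
    return width * height * 2

def frame_bytes_i420(width, height):
    # I420 is planar YUV 4:2:0: a full-resolution Y plane plus
    # quarter-resolution U and V planes
    return width * height + 2 * (width // 2) * (height // 2)

if __name__ == "__main__":
    print(frame_bytes_yuy2(640, 480))  # 614400
    print(frame_bytes_i420(640, 480))  # 460800
```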

When I try to connect with Jetson:

$ gst-launch-1.0 uridecodebin uri='rtsp://<IP_HOST>:8554/test' ! nvoverlaysink

Setting pipeline to PAUSED ...
Opening in BLOCKING MODE 
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://<IP_HOST>:8554/test
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
Opening in BLOCKING MODE 
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
NVMEDIA: NVMEDIABufferProcessing: 1507: NvMediaParserParse Unsupported Codec

What should I change to make it supported? And how can I implement this in a simple OpenCV Python program?

Hi,
It looks like the videoconvert plugin is not converting YUV422 to YUV420 here. Please try this string:

v4l2src device=/dev/video0 ! video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1 ! videoconvert ! video/x-raw,format=I420 ! queue ! x264enc speed-preset=ultrafast tune=zerolatency byte-stream=true threads=1 key-int-max=15 intra-refresh=true ! rtph264pay name=pay0 pt=96

OK, the error disappeared. Connecting from another laptop or a phone works well. Now I need to change the codec so that the Jetson supports it; you can see the problem in the last few lines after trying to connect with my Jetson:

Opening in BLOCKING MODE 
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
NVMEDIA: NVMEDIABufferProcessing: 1507: NvMediaParserParse Unsupported Codec

How can I do that?

Hi,
Since the hardware decoder does not support YUV422, you need to convert to YUV420 and encode to an H.264/H.265 stream.
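For the earlier question about using the stream from a simple OpenCV Python program: a minimal receiver sketch, assuming OpenCV is built with GStreamer support. The software avdec_h264 decoder is used here; on the Jetson, swapping in nvv4l2decoder ! nvvidconv would use the hardware decoder instead:

```python
def make_rtsp_receiver_pipeline(uri, latency_ms=200):
    """GStreamer pipeline string for cv2.VideoCapture: depayload and decode
    an H.264 RTSP stream, then convert to BGR frames for OpenCV."""
    return (
        f"rtspsrc location={uri} latency={latency_ms} "
        "! rtph264depay ! h264parse ! avdec_h264 "  # Jetson: nvv4l2decoder ! nvvidconv
        "! videoconvert ! video/x-raw,format=BGR ! appsink drop=true"
    )

if __name__ == "__main__":
    import cv2  # requires an OpenCV build with GStreamer support

    cap = cv2.VideoCapture(
        make_rtsp_receiver_pipeline("rtsp://<IP_HOST>:8554/test"),
        cv2.CAP_GSTREAMER,
    )
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # run detection on `frame` here
        cv2.imshow("stream", frame)
        if cv2.waitKey(1) == ord("q"):
            break
```

The latency=200 property trades some delay for smoother playback; lowering it reduces delay at the risk of stutter.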

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.