Creating RTSP Stream With v4l2src

I’ve read through a few of the RTSP forum posts and followed the steps to install gst-rtsp-server. I’m now running the test-launch script to create a server from which I want to consume two streams of different sizes. The high-resolution stream I want to display directly with a videosink; from the low-resolution stream I want to sample frames with an OpenCV VideoCapture object, run some processing, and overlay the resulting info on the larger stream using the GStreamer video overlay.

I’m stuck on the first step: streaming my v4l2src over RTSP and reading it back via an rtspsrc. This is what I’ve tried:
Server Side:

./test-launch "v4l2src device="/dev/video2" ! video/x-raw, width=2560, height=720, bitrate=1000000 ! omxh264enc ! video/x-h264, profile=baseline ! rtph264pay name=pay0 pt=96"

Client Side:

gst-launch-1.0 rtspsrc location="rtsp://127.0.0.1:8554/test" latency=0 ! rtph264depay ! h264parse ! omxh264dec ! xvimagesink

I get the following output:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://127.0.0.1:8554/test
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Unhandled error
Additional debug info:
gstrtspsrc.c(6161): gst_rtspsrc_send (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Service Unavailable (503)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Hi,
Do you run both the server and the client commands on the same device? If the server runs on one device and the client on another, you need to connect to the server’s IP instead of 127.0.0.1.

Yes, they are both on the same device, in different terminals.

Hi,
You may try the example in the Jetson Nano FAQ, under “Q: Is there any example of running RTSP streaming?”

When I run the example I get:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://127.1.1.0:8554/test
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
Opening in BLOCKING MODE 
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 

but nothing opens. I think my nvoverlaysink plugin doesn’t work; how would I try a different sink such as xvimagesink?
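
For reference, this is the kind of client line I have in mind, just swapping the final sink element (a sketch that assumes the FAQ server is still running on localhost and serving H264; the nvvidconv is there on the assumption that the decoder output needs converting for an X video sink):

gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test latency=200 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! xvimagesink # assumes the FAQ server is serving H264 on localhost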

The bitrate=1000000 in your original caps looks wrong… bitrate is a property of encoded formats such as H264; raw video caps take a framerate instead.

Assuming your V4L2 camera can do 30 fps at this 2560x720 resolution, you would try:

./test-launch "v4l2src device=/dev/video2 do-timestamp=1 ! video/x-raw, width=2560, height=720, framerate=30/1 ! nvvidconv ! nvv4l2h264enc insert-vui=1 insert-sps-pps=1 ! h264parse ! rtph264pay name=pay0"

And you may check RTSP client from localhost with:

gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test latency=500 ! application/x-rtp,media=video,encoding-name=H264 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! xvimagesink

If this doesn’t work, please post the modes available from your camera with:

v4l2-ctl -d2 --list-formats-ext 

Thanks, this works! However, when I changed the stream to 60 fps the quality dropped significantly. The camera can handle 60 fps at 2560x720 in other pipelines without any drop in quality. Are there any adjustments that can be made to achieve the same quality and framerate via RTSP?

Here is the output from the command:

ioctl: VIDIOC_ENUM_FMT
	Index       : 0
	Type        : Video Capture
	Pixel Format: 'YUYV'
	Name        : YUYV 4:2:2
		Size: Discrete 2560x720
			Interval: Discrete 0.017s (60.000 fps)
			Interval: Discrete 0.033s (30.000 fps)
			Interval: Discrete 0.067s (15.000 fps)
		Size: Discrete 1344x376
			Interval: Discrete 0.010s (100.000 fps)
			Interval: Discrete 0.017s (60.000 fps)
			Interval: Discrete 0.033s (30.000 fps)
			Interval: Discrete 0.067s (15.000 fps)
		Size: Discrete 3840x1080
			Interval: Discrete 0.033s (30.000 fps)
			Interval: Discrete 0.067s (15.000 fps)
		Size: Discrete 4416x1242
			Interval: Discrete 0.067s (15.000 fps)

Is there a way to modify the bitrate, since all the conversion is being done locally on my Jetson?
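
For example, would setting it directly on the encoder be the right approach? Something like this is what I have in mind (just a sketch; the bitrate value is an arbitrary assumption, not a measured recommendation):

./test-launch "v4l2src device=/dev/video2 do-timestamp=1 ! video/x-raw, width=2560, height=720, framerate=60/1 ! nvvidconv ! nvv4l2h264enc bitrate=8000000 maxperf-enable=1 insert-vui=1 insert-sps-pps=1 ! h264parse ! rtph264pay name=pay0" # bitrate in bits/s; 8000000 is only an illustrative value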

I just noticed I also get this error when I run the rtspsrc command:

(gst-launch-1.0:14923): GStreamer-CRITICAL **: 16:11:08.718: gst_mini_object_unref: assertion 'mini_object != NULL' failed
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
reference in DPB was never decoded

I solved the quality drop by encoding in H265 instead of H264:

./test-launch "v4l2src device=/dev/video2 ! video/x-raw, width=2560, height=720, framerate=60/1 ! nvvidconv ! nvv4l2h265enc maxperf-enable=1 ! h265parse ! rtph265pay name=pay0"

gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test latency=0 ! rtph265depay ! h265parse ! nvv4l2decoder ! nvvidconv ! xvimagesink sync=0

