RTSP in DeepStream

Please provide complete information as applicable to your setup.

• Hardware Platform: GPU
• DeepStream Version: 5.0
• JetPack Version (valid for Jetson only): not applicable
• TensorRT Version: 7.0
• NVIDIA GPU Driver Version (valid for GPU only): 440.33
• Issue Type: Can't get an RTSP stream working with nvv4l2decoder. The goal is to run inference on an RTSP stream via a DeepStream pipeline. The pipeline works well with stored .mp4 video files, but not with RTSP streams.
• How to reproduce the issue? First host a valid RTSP stream, then on the client side run the following pipelines. No error is displayed, and no video is displayed.
A) gst-launch-1.0 rtspsrc location=rtsp://192.168.0.13/screenlive latency=100 ! queue ! rtph264depay ! nvv4l2decoder ! nvvideoconvert ! autovideosink
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.0.13/screenlive
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (request) SETUP stream 1
Progress: (open) Opened Stream
Setting pipeline to PLAYING …
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request

B) gst-launch-1.0 rtspsrc location=rtsp://192.168.0.13/screenlive latency=100 ! queue ! rtph264depay ! nvv4l2decoder ! nvvideoconvert ! nvdsosd ! nveglglessink
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Got context from element 'eglglessink0': gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.0.13/screenlive
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (request) SETUP stream 1
Progress: (open) Opened Stream
Setting pipeline to PLAYING …
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
^Chandling interrupt.
Interrupt: Stopping pipeline …
Execution ended after 0:00:02.634273550
Setting pipeline to PAUSED …
Setting pipeline to READY …
^C

There is no video output when the pipeline is executed, and there is no ERROR either.


We cannot reproduce the problem with your pipeline.

Can you try the following command to get more log and information?

gst-launch-1.0 --gst-debug=v4l2videodec:5,nveglglessink:5 rtspsrc location=rtsp://192.168.0.13/screenlive latency=100 ! queue ! rtph264depay ! nvv4l2decoder ! nvvideoconvert ! nvdsosd ! nveglglessink

Hey Fiona, thanks for the prompt reply. Here is the result of the debug command:

gst-launch-1.0 --gst-debug=v4l2videodec:5,nveglglessink:5 rtspsrc location=rtsp://192.168.0.13/screenlive latency=100 ! queue ! rtph264depay ! nvv4l2decoder ! nvvideoconvert ! nvdsosd ! nveglglessink
0:00:00.053829751 6540 0x561d569b4010 DEBUG v4l2videodec gstv4l2videodec.c:1669:gst_v4l2_video_dec_sink_getcaps: Returning sink caps image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-h264, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string){ byte-stream }, alignment=(string){ au }; video/x-h265, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string){ byte-stream }, alignment=(string){ au }; video/mpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], mpegversion=(int)4, systemstream=(boolean)false, parsed=(boolean)true; video/mpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], mpegversion=(int)[ 1, 2 ], systemstream=(boolean)false, parsed=(boolean)true
0:00:00.053876613 6540 0x561d569b4010 DEBUG v4l2videodec gstv4l2videodec.c:1669:gst_v4l2_video_dec_sink_getcaps: Returning sink caps image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-h264, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string){ byte-stream }, alignment=(string){ au }; video/x-h265, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string){ byte-stream }, alignment=(string){ au }; video/mpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], mpegversion=(int)4, systemstream=(boolean)false, parsed=(boolean)true; video/mpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], mpegversion=(int)[ 1, 2 ], systemstream=(boolean)false, parsed=(boolean)true
0:00:00.053893493 6540 0x561d569b4010 DEBUG v4l2videodec gstv4l2videodec.c:1645:gst_v4l2_video_dec_src_query: Returning src caps video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:00.053925692 6540 0x561d569b4010 DEBUG v4l2videodec gstv4l2videodec.c:1645:gst_v4l2_video_dec_src_query: Returning src caps video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:00.053952229 6540 0x561d569b4010 DEBUG v4l2videodec gstv4l2videodec.c:1645:gst_v4l2_video_dec_src_query: Returning src caps video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:00.053978687 6540 0x561d569b4010 DEBUG v4l2videodec gstv4l2videodec.c:1645:gst_v4l2_video_dec_src_query: Returning src caps video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:00.054003795 6540 0x561d569b4010 DEBUG v4l2videodec gstv4l2videodec.c:1645:gst_v4l2_video_dec_src_query: Returning src caps video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:00.054040483 6540 0x561d569b4010 DEBUG v4l2videodec gstv4l2videodec.c:1645:gst_v4l2_video_dec_src_query: Returning src caps video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
Setting pipeline to PAUSED …
0:00:00.058944005 6540 0x561d569b4010 DEBUG v4l2videodec gstv4l2videodec.c:510:gst_v4l2_video_dec_open: Opening
0:00:00.059579786 6540 0x561d569b4010 DEBUG nveglglessink ext/eglgles/gsteglglessink.c:773:gst_eglglessink_start: Starting
0:00:00.059623208 6540 0x561d569b4010 DEBUG nveglglessink ext/eglgles/gsteglglessink.c:825:gst_eglglessink_start: Started
0:00:00.059680999 6540 0x561d564dccf0 DEBUG nveglglessink ext/eglgles/gsteglglessink.c:590:render_thread_func: posting ENTER stream status
0:00:00.343715563 6540 0x561d569b4010 DEBUG v4l2videodec gstv4l2videodec.c:591:gst_v4l2_video_dec_start: Starting
Pipeline is live and does not need PREROLL …
Got context from element 'eglglessink0': gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.0.13/screenlive
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (request) SETUP stream 1
Progress: (open) Opened Stream
Setting pipeline to PLAYING …
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
0:00:00.749986370 6540 0x561d564dcc00 DEBUG v4l2videodec gstv4l2videodec.c:1669:gst_v4l2_video_dec_sink_getcaps: Returning sink caps video/x-h264, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string)byte-stream, alignment=(string)au, parsed=(boolean)true
0:00:00.750395170 6540 0x561d564dcc00 DEBUG v4l2videodec gstv4l2videodec.c:1669:gst_v4l2_video_dec_sink_getcaps: Returning sink caps video/x-h264, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string)byte-stream, alignment=(string)au, parsed=(boolean)true
0:00:00.750473371 6540 0x561d564dcc00 DEBUG v4l2videodec gstv4l2videodec.c:648:gst_v4l2_video_dec_set_format: Setting format: video/x-h264, stream-format=(string)byte-stream, alignment=(string)au
0:00:00.757371992 6540 0x561d564dcc00 DEBUG v4l2videodec gstv4l2videodec.c:1277:gst_v4l2_video_dec_handle_frame: Handling frame 0
0:00:00.757413301 6540 0x561d564dcc00 DEBUG v4l2videodec gstv4l2videodec.c:1318:gst_v4l2_video_dec_handle_frame: Sending header

There is no frame data received. Can you try a larger latency value, e.g. 2000? If that does not work, please check your RTSP server. This is not caused by DeepStream.
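For example, the pipeline from the original post with the suggested larger latency would look like this (a sketch only: the URL is the poster's own, and it requires an NVIDIA GPU setup plus a reachable RTSP server to actually run):

```shell
# Same pipeline as in the original post, but with the rtspsrc jitterbuffer
# latency raised from 100 ms to the suggested 2000 ms.
gst-launch-1.0 rtspsrc location=rtsp://192.168.0.13/screenlive latency=2000 ! \
  queue ! rtph264depay ! nvv4l2decoder ! nvvideoconvert ! nvdsosd ! nveglglessink
```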

I checked the RTSP stream in VLC media player, where it works fine, and increasing the latency (2000, 50000) didn't work either.

So you need to debug RTSP. VLC has its own error-resilience implementation, so the stream playing there does not mean there is no problem with the RTSP server.

There are third-party RTSP debug tools, e.g. Wireshark (https://www.wireshark.org/download.html), which can help analyze the RTSP requests and responses.
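As a sketch of such a capture, Wireshark's command-line companion tshark can record the RTSP signalling between client and server. The interface name `eth0` and port 554 (the default RTSP port) are assumptions; substitute your own interface and the port your server actually uses:

```shell
# Capture traffic on the RTSP control port and show only RTSP messages,
# fully decoded (-V), so SETUP/PLAY requests and responses can be inspected.
tshark -i eth0 -f "tcp port 554" -Y rtsp -V
```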


The same RTSP link works flawlessly with the pipelines below:

gst-launch-1.0 rtspsrc location=rtsp://192.168.0.13/screenlive latency=100 ! queue ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! videoscale ! video/x-raw,width=640,height=480 ! autovideosink

AND

gst-launch-1.0 rtspsrc location=rtsp://192.168.0.13/screenlive latency=100 ! queue ! rtph264depay ! h264parse ! avdec_h264 ! nvvideoconvert ! videoscale ! video/x-raw,width=640,height=480 ! nveglglessink

It's just that when nvv4l2decoder is used, there is no output for this RTSP stream.
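One visible difference between the failing pipelines and the two working ones above is that the working pipelines insert h264parse before the decoder. A common debugging step (a suggestion only, not a confirmed fix for this server; the decoder's sink caps in the debug log above advertise stream-format=byte-stream, alignment=au, which h264parse produces) would be:

```shell
# Sketch: add h264parse in front of nvv4l2decoder so the depayloaded H.264
# stream is parsed into byte-stream/au form before it reaches the decoder.
gst-launch-1.0 rtspsrc location=rtsp://192.168.0.13/screenlive latency=100 ! \
  queue ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvideoconvert ! nveglglessink
```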

What platform are you using? Can the sample h264 file be decoded with nvv4l2decoder?

gst-launch-1.0 --gst-debug=v4l2videodec:5 filesrc location=/opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_720p.h264 ! h264parse ! nvv4l2decoder ! nvvideoconvert ! nveglglessink

I am using Ubuntu 18.04, and yes, the sample h264 file is decoded with nvv4l2decoder. As mentioned above, gst-launch-1.0 rtspsrc location=rtsp://192.168.0.13/screenlive latency=100 ! queue ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! videoscale ! video/x-raw,width=640,height=480 ! autovideosink works; replacing avdec_h264 with nvv4l2decoder stops everything.

There has been no update from you for a while, so we are assuming this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.
Thanks

Are you using Jetson or dGPU? We cannot reproduce your problem with the command lines you gave, so we don't know what is happening on your device. Can you provide information that will guarantee we can reproduce the problem?

Hardware Platform (Jetson / GPU):
DeepStream Version:
JetPack Version (valid for Jetson only):
TensorRT Version:
NVIDIA GPU Driver Version (valid for GPU only):
Issue Type (questions, new requirements, bugs):
How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file content, the command line used, and other details for reproducing.)
Requirement details (This is for new requirements. Include the module name, i.e. which plugin or which sample application, and the function description.)