nvv4l2decoder does not work with a certain RTSP video stream

Summary
I am working with a Tesla T4 and the official DeepStream 6.0.1 container.
My pipeline consists of multiple input cameras, an object detector, secondary models, a tracker, and probes to retrieve metadata. The pipeline works fine with the majority of the cameras. However, I found an RTSP video stream that can’t be processed by the pipeline. I believe the issue is caused by the nvv4l2decoder component.
Details
The attached code creates the components used to decode an RTSP video camera stream. It works with the majority of the cameras; however, I found a camera that cannot be processed by the pipeline (nothing happens) when using nvv4l2decoder.
Note that if I replace nvv4l2decoder with an avdec_h264 component, the pipeline works fine for every camera. This suggests that the issue is caused by nvv4l2decoder. However, using avdec_h264 is not a solution, as I need nvv4l2decoder to decode video on the GPU.
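For reference, here is a minimal sketch of the kind of decode bin described above (the structure is assumed for illustration; this is not the attached rtsp_source_bin.py):

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

def create_rtsp_source_bin(index, uri, decoder_name="nvv4l2decoder"):
    # Build rtspsrc -> rtph264depay -> h264parse -> <decoder> inside a bin.
    # Passing decoder_name="avdec_h264" switches decoding to the CPU.
    source_bin = Gst.Bin.new("source-bin-%d" % index)

    src = Gst.ElementFactory.make("rtspsrc", None)
    src.set_property("location", uri)
    src.set_property("protocols", 4)  # 4 = TCP only

    depay = Gst.ElementFactory.make("rtph264depay", None)
    parse = Gst.ElementFactory.make("h264parse", None)
    dec = Gst.ElementFactory.make(decoder_name, None)

    for element in (src, depay, parse, dec):
        source_bin.add(element)
    depay.link(parse)
    parse.link(dec)

    # rtspsrc creates its pads dynamically once the stream is negotiated
    def on_pad_added(_rtspsrc, pad):
        sinkpad = depay.get_static_pad("sink")
        if not sinkpad.is_linked():
            pad.link(sinkpad)

    src.connect("pad-added", on_pad_added)

    # Expose the decoder output as the bin's src pad
    source_bin.add_pad(Gst.GhostPad.new("src", dec.get_static_pad("src")))
    return source_bin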
Attached you can find:

  • The code to create the bin that decodes the RTSP video streams
  • One PDF with the pipeline graph for a camera that runs fine
  • One PDF with the pipeline graph for the problematic camera that does not run

Note that, following this link (Issues decoding RTSP stream using nvv4l2decoder with Jetpack 4.4), I already tried removing the h264parse component from the pipeline, but this didn’t solve the issue.

Note also that the problematic camera can be streamed correctly using OpenCV and other tools. In fact, as explained above, the pipeline works with every camera when using avdec_h264.

pipeline_not_working.pdf (42.9 KB)
pipeline_working.pdf (41.8 KB)
rtsp_source_bin.py (7.0 KB)

Thank you for your help!

Hi @mfoglio
1. Could you try the following pipeline and share the failure output with us?
gst-launch-1.0 --gst-debug=v4l2videodec:5 rtspsrc location=rtsp://XXX protocols=4 ! rtph264depay ! h264parse ! nvv4l2decoder ! fakesink
2. Could you capture your RTSP stream and attach it for us?
Thanks

Hi @yuweiw , thank you for your reply. I will share everything with you tomorrow.

Hello @yuweiw , here’s the information you asked for.

Pipeline
Command:

gst-launch-1.0 --gst-debug=v4l2videodec:5 rtspsrc location=rtsp://localhost/cbb48b92-f74d-4ad5-b8a9-3affbefcc17e_default/2faf8bbe-f526-48a8-89bc-a7431f60834e protocols=4 ! rtph264depay ! h264parse ! nvv4l2decoder ! fakesink

Output:

0:00:00.019013274   396 0x559a6952d300 DEBUG           v4l2videodec gstv4l2videodec.c:1740:gst_v4l2_video_dec_sink_getcaps:<nvv4l2decoder0> Returning sink caps image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-h264, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string){ byte-stream }, alignment=(string){ au }; video/x-h265, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string){ byte-stream }, alignment=(string){ au }; video/mpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], mpegversion=(int)4, systemstream=(boolean)false, parsed=(boolean)true; video/mpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], mpegversion=(int)[ 1, 2 ], systemstream=(boolean)false, parsed=(boolean)true; video/x-divx, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], divxversion=(int)[ 4, 5 ]; video/x-vp8, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-vp9, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:00.019115290   396 0x559a6952d300 DEBUG           v4l2videodec gstv4l2videodec.c:1740:gst_v4l2_video_dec_sink_getcaps:<nvv4l2decoder0> Returning sink caps image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-h264, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string){ byte-stream }, alignment=(string){ au }; video/x-h265, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string){ byte-stream }, alignment=(string){ au }; video/mpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], mpegversion=(int)4, systemstream=(boolean)false, parsed=(boolean)true; video/mpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], mpegversion=(int)[ 1, 2 ], systemstream=(boolean)false, parsed=(boolean)true; video/x-divx, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], divxversion=(int)[ 4, 5 ]; video/x-vp8, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-vp9, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:00.019151517   396 0x559a6952d300 DEBUG           v4l2videodec gstv4l2videodec.c:1716:gst_v4l2_video_dec_src_query:<nvv4l2decoder0> Returning src caps video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:00.019175626   396 0x559a6952d300 DEBUG           v4l2videodec gstv4l2videodec.c:1716:gst_v4l2_video_dec_src_query:<nvv4l2decoder0> Returning src caps video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
Setting pipeline to PAUSED ...
0:00:00.019241825   396 0x559a6952d300 DEBUG           v4l2videodec gstv4l2videodec.c:565:gst_v4l2_video_dec_open:<nvv4l2decoder0> Opening
0:00:00.064544468   396 0x559a6952d300 DEBUG           v4l2videodec gstv4l2videodec.c:647:gst_v4l2_video_dec_start:<nvv4l2decoder0> Starting
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://localhost/cbb48b92-f74d-4ad5-b8a9-3affbefcc17e_default/2faf8bbe-f526-48a8-89bc-a7431f60834e
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request

0:00:03.692429416   396 0x7fe1680075e0 DEBUG           v4l2videodec gstv4l2videodec.c:1740:gst_v4l2_video_dec_sink_getcaps:<nvv4l2decoder0> Returning sink caps video/x-h264, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string)byte-stream, alignment=(string)au, parsed=(boolean)true
0:00:03.692555984   396 0x7fe1680075e0 DEBUG           v4l2videodec gstv4l2videodec.c:1740:gst_v4l2_video_dec_sink_getcaps:<nvv4l2decoder0> Returning sink caps video/x-h264, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string)byte-stream, alignment=(string)au, parsed=(boolean)true
0:00:03.692630633   396 0x7fe1680075e0 DEBUG           v4l2videodec gstv4l2videodec.c:1740:gst_v4l2_video_dec_sink_getcaps:<nvv4l2decoder0> Returning sink caps video/x-h264, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string)byte-stream, alignment=(string)au, parsed=(boolean)true
0:00:03.692665349   396 0x7fe1680075e0 DEBUG           v4l2videodec gstv4l2videodec.c:704:gst_v4l2_video_dec_set_format:<nvv4l2decoder0> Setting format: video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, parsed=(boolean)true
0:00:03.694768156   396 0x7fe1680075e0 DEBUG           v4l2videodec gstv4l2videodec.c:1740:gst_v4l2_video_dec_sink_getcaps:<nvv4l2decoder0> Returning sink caps video/x-h264, width=(int)3840, height=(int)2160, framerate=(fraction)0/1, stream-format=(string)byte-stream, alignment=(string)au, parsed=(boolean)true, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, profile=(string)high, level=(string)5.1
0:00:03.694805890   396 0x7fe1680075e0 DEBUG           v4l2videodec gstv4l2videodec.c:704:gst_v4l2_video_dec_set_format:<nvv4l2decoder0> Setting format: video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)3840, height=(int)2160, framerate=(fraction)0/1, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, profile=(string)high, level=(string)5.1
0:00:03.695454075   396 0x7fe1680075e0 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 0
0:00:03.695467086   396 0x7fe1680075e0 DEBUG           v4l2videodec gstv4l2videodec.c:1389:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Sending header

Note that the pipeline gets stuck right after starting. No further output is generated in the console.
I also used ffprobe to make sure the camera was up:

ffprobe rtsp://localhost/cbb48b92-f74d-4ad5-b8a9-3affbefcc17e_default/2faf8bbe-f526-48a8-89bc-a7431f60834e
ffprobe version 3.4.8-0ubuntu0.2 Copyright (c) 2007-2020 the FFmpeg developers
  built with gcc 7 (Ubuntu 7.5.0-3ubuntu1~18.04)
  configuration: --prefix=/usr --extra-version=0ubuntu0.2 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-librsvg --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
  libavutil      55. 78.100 / 55. 78.100
  libavcodec     57.107.100 / 57.107.100
  libavformat    57. 83.100 / 57. 83.100
  libavdevice    57. 10.100 / 57. 10.100
  libavfilter     6.107.100 /  6.107.100
  libavresample   3.  7.  0 /  3.  7.  0
  libswscale      4.  8.100 /  4.  8.100
  libswresample   2.  9.100 /  2.  9.100
  libpostproc    54.  7.100 / 54.  7.100
[rtsp @ 0x55785053c080] method SETUP failed: 461 Unsupported transport
Input #0, rtsp, from 'rtsp://localhost/cbb48b92-f74d-4ad5-b8a9-3affbefcc17e_default/2faf8bbe-f526-48a8-89bc-a7431f60834e':
  Duration: N/A, start: 0.034000, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuvj420p(pc, bt709, progressive), 3840x2160, 30 tbr, 90k tbn, 180k tbc

Video
The RTSP stream is local. I used the following command to dump the RTSP stream into a file:

ffmpeg -i rtsp://localhost/cbb48b92-f74d-4ad5-b8a9-3affbefcc17e_default/2faf8bbe-f526-48a8-89bc-a7431f60834e -acodec copy -vcodec copy video.mp4

I am going to send you the file in a few minutes.
Thank you for your help!

I just shared the video privately with you. Let me know if you need anything else. Thanks!

Hi, I have received the video you attached. When I use the pipeline below, it works well.

You can try it in your env.

gst-launch-1.0 -e filesrc location=video.mp4 ! qtdemux ! h264parse ! nvv4l2decoder ! queue2 ! nvv4l2h264enc ! h264parse ! qtmux ! filesink location='test-nvv4l2decoder.mp4'

If it works well in your env, nvv4l2decoder may not be the reason.

Also, you can try the pipeline below in your env and check if it works well:

GST_DEBUG=3 gst-launch-1.0  rtspsrc location=rtsp://localhost/cbb48b92-f74d-4ad5-b8a9-3affbefcc17e_default/2faf8bbe-f526-48a8-89bc-a7431f60834e protocols=4 ! rtph264depay ! h264parse ! nvv4l2decoder ! queue2 ! nvv4l2h264enc ! h264parse ! qtmux ! filesink location='test-nvv4l2decoder.mp4'

Hi @yuweiw , I tried to run your second pipeline against the rtspsrc stream, but it does not work. I’ll share the problematic RTSP video stream URL with you as soon as I can.

Hello again @yuweiw . I have just shared with you an RTSP URL to reproduce the issue. You should find it in your messages. Thank you!

Hi @mfoglio , with your RTSP URL I can reproduce your issue. It is not a decoder error; it’s a compatibility issue with the stream data. The nvv4l2decoder may not recognize the data format you sent. We suggest you use uridecodebin to test your RTSP source. It’s simpler and has better compatibility.
You can try the pipeline below in your env.

gst-launch-1.0 uridecodebin uri=XXX ! nvvideoconvert ! fakesink

Hi @yuweiw , thank you for your response. It seems that the following works fine:

gst-launch-1.0 uridecodebin uri=xxx ! filesink location='test-nvv4l2decoder.mp4'

I’ll try replacing my custom bin with uridecodebin in my pipeline and see if it works. I see no reason why it wouldn’t, but let me try it first before closing the thread.
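For reference, a rough sketch of how I expect uridecodebin to slot into the pipeline (element names and property values are assumed for illustration):

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

pipeline = Gst.Pipeline.new("pipeline")

streammux = Gst.ElementFactory.make("nvstreammux", "streammux")
streammux.set_property("batch-size", 1)
streammux.set_property("width", 1920)
streammux.set_property("height", 1080)

source = Gst.ElementFactory.make("uridecodebin", "source-0")
source.set_property("uri", "rtsp://XXX")  # placeholder URI

pipeline.add(source)
pipeline.add(streammux)

def on_pad_added(_decodebin, pad):
    # uridecodebin exposes decoded pads dynamically; link only the video pad
    caps = pad.get_current_caps()
    if caps and caps.get_structure(0).get_name().startswith("video"):
        sinkpad = streammux.get_request_pad("sink_0")
        pad.link(sinkpad)

source.connect("pad-added", on_pad_added)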

I also have another related question, needed to completely solve the issue. Originally, I was using a custom bin instead of uridecodebin because I could decode a video using the CPU instead of the NVDEC chip simply by replacing nvv4l2decoder with avdec_h264. Do you know if there is a way to force uridecodebin to decode the video using the hardware I want (CPU vs. NVDEC)?
I’d like to do that because we are decoding a lot of cameras. The GPU can decode many of them, but not all. So I’d like to use the CPU to decode some streams. Suppose I want to process 40 1080p video streams: I would decode 30 using the NVDEC chip on the GPU and 10 using the CPU. I already have the code to do that; I just need to know how I can force uridecodebin to use the NVDEC chip or the CPU.
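One approach I’m considering (an untested sketch, based on the fact that decodebin picks decoders by plugin rank) is lowering nvv4l2decoder’s rank so that uridecodebin auto-plugs avdec_h264 instead:

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Lower nvv4l2decoder's rank so decodebin/uridecodebin prefers the software
# decoder (avdec_h264). Note this is process-wide; per-stream GPU/CPU
# selection would likely need decodebin's "autoplug-select" signal instead.
feature = Gst.Registry.get().lookup_feature("nvv4l2decoder")
if feature is not None:
    feature.set_rank(Gst.Rank.NONE)

source = Gst.ElementFactory.make("uridecodebin", "cpu-source")
source.set_property("uri", "rtsp://XXX")  # placeholder URI

Since the rank change is global, this alone wouldn’t let me mix NVDEC and CPU decoding in the same process, so I’d be interested in a per-stream mechanism if one exists.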

Hello @yuweiw , thank you again for your reply.
I modified my pipeline to use uridecodebin. This fixes the issue with the RTSP stream that I sent you in a private message.
However, there is now another RTSP source that previously worked with my pipeline but does not work with uridecodebin. I’ll share the RTSP URL with you privately.

Thank you

Hi @mfoglio , please submit a new topic about the new RTSP source that cannot be played with uridecodebin, to avoid confusion, and assign it to me.
You can close this issue, and I will check the new RTSP source in the new topic.
Thanks a lot

Hello @yuweiw , thank you for your reply. I opened a new thread here: Uridecodebin cannot decode an RTSP video stream
