Uridecodebin cannot decode an RTSP video stream

Hello,

I am working on a DeepStream pipeline, but I am having trouble decoding an RTSP video stream. The stream can be decoded with VLC, ffplay, and OpenCV.

Summary
I am working on a pipeline to run AI on multiple live cameras using DeepStream 6.0.1, a Tesla T4, and the DeepStream Python APIs.

My custom pipeline worked well with the majority of cameras. However, I found a camera that could not be decoded by my pipeline (see the file attached to that thread for further info). Thanks to the help of the NVIDIA forum, it was discovered that the camera sends timestamps in a format that nvv4l2decoder cannot digest. The solution, proposed here https://forums.developer.nvidia.com/t/nvv4l2decoder-does-not-work-with-certain-rtsp-video-stream/ , was to use uridecodebin to decode that specific camera. However, uridecodebin cannot decode other streams that previously worked fine with my pipeline.
Goal
My goal is to develop a pipeline that can decode any RTSP video using the NVDEC chip. All the video streams used for testing can be decoded fine with OpenCV, or with a pipeline running on the CPU (see file attached here). Therefore, I assume there must be a way to decode them using NVDEC.
Debug the issue
To reproduce the issue you can use:

gst-launch-1.0 uridecodebin uri=xxx ! filesink location='test-nvv4l2decoder.mp4'

I already shared the rtsp video stream privately with @yuweiw .

Thank you for your help!

Hi, @mfoglio
===>However, uridecodebin cannot decode other streams that were previously working fine with my pipeline.

Does the pipeline below work well for you?

gst-launch-1.0 --gst-debug=v4l2videodec:5  rtspsrc location=XXX  ! rtph264depay ! h264parse ! nvv4l2decoder ! fakesink

I used this pipeline, and it does not work well with the new RTSP source you sent me. So could you give me a pipeline that works in your env with the new RTSP src? Thanks

Hi @yuweiw , the following will work:

gst-launch-1.0 --gst-debug=v4l2videodec:5 rtspsrc location=$RTSP_STREAM  protocols=tcp latency=1000 drop-on-latency=1 timeout=5000000 ! rtph264depay ! h264parse ! nvv4l2decoder cudadec-memtype=2 num-extra-surfaces=1 ! queue leaky=2 max-size-buffers=1 ! nvvideoconvert nvbuf-memory-type=3 output-buffers=1 ! capsfilter caps=video/x-raw,format=RGBA ! fakesink

This is the output I have:

0:00:00.066831278 12552 0x5627407b18f0 DEBUG           v4l2videodec gstv4l2videodec.c:1740:gst_v4l2_video_dec_sink_getcaps:<nvv4l2decoder0> Returning sink caps image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-h264, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string){ byte-stream }, alignment=(string){ au }; video/x-h265, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string){ byte-stream }, alignment=(string){ au }; video/mpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], mpegversion=(int)4, systemstream=(boolean)false, parsed=(boolean)true; video/mpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], mpegversion=(int)[ 1, 2 ], systemstream=(boolean)false, parsed=(boolean)true; video/x-divx, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], divxversion=(int)[ 4, 5 ]; video/x-vp8, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-vp9, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:00.066910670 12552 0x5627407b18f0 DEBUG           v4l2videodec gstv4l2videodec.c:1740:gst_v4l2_video_dec_sink_getcaps:<nvv4l2decoder0> Returning sink caps image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-h264, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string){ byte-stream }, alignment=(string){ au }; video/x-h265, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string){ byte-stream }, alignment=(string){ au }; video/mpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], mpegversion=(int)4, systemstream=(boolean)false, parsed=(boolean)true; video/mpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], mpegversion=(int)[ 1, 2 ], systemstream=(boolean)false, parsed=(boolean)true; video/x-divx, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], divxversion=(int)[ 4, 5 ]; video/x-vp8, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-vp9, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:00.066935338 12552 0x5627407b18f0 DEBUG           v4l2videodec gstv4l2videodec.c:1716:gst_v4l2_video_dec_src_query:<nvv4l2decoder0> Returning src caps video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:00.066956286 12552 0x5627407b18f0 DEBUG           v4l2videodec gstv4l2videodec.c:1716:gst_v4l2_video_dec_src_query:<nvv4l2decoder0> Returning src caps video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:00.066973847 12552 0x5627407b18f0 DEBUG           v4l2videodec gstv4l2videodec.c:1716:gst_v4l2_video_dec_src_query:<nvv4l2decoder0> Returning src caps video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:00.067026633 12552 0x5627407b18f0 DEBUG           v4l2videodec gstv4l2videodec.c:1716:gst_v4l2_video_dec_src_query:<nvv4l2decoder0> Returning src caps video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:00.067069773 12552 0x5627407b18f0 DEBUG           v4l2videodec gstv4l2videodec.c:1716:gst_v4l2_video_dec_src_query:<nvv4l2decoder0> Returning src caps video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:00.067109192 12552 0x5627407b18f0 DEBUG           v4l2videodec gstv4l2videodec.c:1716:gst_v4l2_video_dec_src_query:<nvv4l2decoder0> Returning src caps video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:00.067147533 12552 0x5627407b18f0 DEBUG           v4l2videodec gstv4l2videodec.c:1716:gst_v4l2_video_dec_src_query:<nvv4l2decoder0> Returning src caps video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:00.067183587 12552 0x5627407b18f0 DEBUG           v4l2videodec gstv4l2videodec.c:1716:gst_v4l2_video_dec_src_query:<nvv4l2decoder0> Returning src caps video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
Setting pipeline to PAUSED ...
0:00:00.067265417 12552 0x5627407b18f0 DEBUG           v4l2videodec gstv4l2videodec.c:565:gst_v4l2_video_dec_open:<nvv4l2decoder0> Opening
0:00:00.068350077 12552 0x5627407b18f0 DEBUG           v4l2videodec gstv4l2videodec.c:647:gst_v4l2_video_dec_start:<nvv4l2decoder0> Starting
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://......................
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
0:00:01.391966005 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1740:gst_v4l2_video_dec_sink_getcaps:<nvv4l2decoder0> Returning sink caps video/x-h264, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string)byte-stream, alignment=(string)au, parsed=(boolean)true
0:00:01.392131583 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1740:gst_v4l2_video_dec_sink_getcaps:<nvv4l2decoder0> Returning sink caps video/x-h264, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string)byte-stream, alignment=(string)au, parsed=(boolean)true
0:00:01.392273472 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1740:gst_v4l2_video_dec_sink_getcaps:<nvv4l2decoder0> Returning sink caps video/x-h264, width=(int)[ 1, 32768 ], height=(int)[ 1, 32768 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string)byte-stream, alignment=(string)au, parsed=(boolean)true
0:00:01.392325174 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:704:gst_v4l2_video_dec_set_format:<nvv4l2decoder0> Setting format: video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, parsed=(boolean)true
0:00:01.393229369 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1740:gst_v4l2_video_dec_sink_getcaps:<nvv4l2decoder0> Returning sink caps video/x-h264, width=(int)800, height=(int)600, framerate=(fraction)13500000/465517, stream-format=(string)byte-stream, alignment=(string)au, parsed=(boolean)true, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, profile=(string)high, level=(string)3.1
0:00:01.393260420 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:704:gst_v4l2_video_dec_set_format:<nvv4l2decoder0> Setting format: video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)800, height=(int)600, framerate=(fraction)13500000/465517, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, profile=(string)high, level=(string)3.1
0:00:01.393883495 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 0
0:00:01.393895854 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1389:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Sending header
0:00:01.619232787 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1482:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Acquired caps: video/x-raw, format=(string)NV12, width=(int)800, height=(int)600, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)1:3:5:1, framerate=(fraction)0/1
0:00:01.619419480 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1489:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Available caps: image/jpeg, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)800, height=(int)608, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string){ bt601, smpte240m, bt709, 2:4:5:2, 2:4:5:3, 1:4:7:1, 2:4:7:1, 2:4:12:8, bt2020, 2:0:0:0 }; video/x-raw, format=(string)NV12, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)800, height=(int)608, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string){ bt601, smpte240m, bt709, 2:4:5:2, 2:4:5:3, 1:4:7:1, 2:4:7:1, 2:4:12:8, bt2020, 2:0:0:0 }; video/mpeg, mpegversion=(int)4, systemstream=(boolean)false, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)800, height=(int)608, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string){ bt601, smpte240m, bt709, 2:4:5:2, 2:4:5:3, 1:4:7:1, 2:4:7:1, 2:4:12:8, bt2020, 2:0:0:0 }; video/mpeg, mpegversion=(int)2, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)800, height=(int)608, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string){ bt601, smpte240m, bt709, 2:4:5:2, 2:4:5:3, 1:4:7:1, 2:4:7:1, 2:4:12:8, bt2020, 2:0:0:0 }; video/x-vp9, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)800, height=(int)608, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string){ bt601, smpte240m, bt709, 2:4:5:2, 2:4:5:3, 1:4:7:1, 2:4:7:1, 2:4:12:8, bt2020, 2:0:0:0 }; video/x-vp8, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)800, height=(int)608, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string){ bt601, smpte240m, bt709, 2:4:5:2, 2:4:5:3, 1:4:7:1, 2:4:7:1, 2:4:12:8, bt2020, 2:0:0:0 }; video/x-h265, stream-format=(string)byte-stream, alignment=(string)au, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)800, height=(int)608, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string){ bt601, smpte240m, bt709, 2:4:5:2, 2:4:5:3, 1:4:7:1, 2:4:7:1, 2:4:12:8, bt2020, 2:0:0:0 }; video/x-h264, 
stream-format=(string)byte-stream, alignment=(string)au, framerate=(fraction)[ 0/1, 2147483647/1 ], width=(int)800, height=(int)608, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string){ bt601, smpte240m, bt709, 2:4:5:2, 2:4:5:3, 1:4:7:1, 2:4:7:1, 2:4:12:8, bt2020, 2:0:0:0 }
0:00:01.619459139 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1497:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Filtered caps: EMPTY
0:00:01.619554930 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1514:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Possible decoded caps: video/x-raw(memory:NVMM), width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ], format=(string){ I420, NV12, P010_10LE, BGRx, RGBA, GRAY8, GBR }
0:00:01.619570253 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1523:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Chosen decoded caps: video/x-raw(memory:NVMM), width=(int)1, height=(int)1, framerate=(fraction)0/1, format=(string)I420
0:00:01.619959132 12552 0x7f8934037190 WARN            v4l2videodec gstv4l2videodec.c:1685:gst_v4l2_video_dec_decide_allocation:<nvv4l2decoder0> Duration invalid, not setting latency
0:00:01.620303087 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1576:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Starting decoding thread
0:00:01.620413833 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 1
0:00:01.620470077 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 2
0:00:01.620515588 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 3
0:00:01.620575964 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 4
0:00:01.620616567 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 5
0:00:01.620657608 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 6
0:00:01.620694136 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 7
0:00:01.620735514 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 8
0:00:01.620788366 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 9
0:00:01.620835270 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 10
0:00:01.620879750 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 11
0:00:01.620918937 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 12
0:00:01.620956976 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 13
0:00:01.620995358 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 14
0:00:01.621032922 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 15
0:00:01.621071502 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 16
0:00:01.621109258 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 17
0:00:01.621149378 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 18
0:00:01.621194887 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 19
0:00:01.621240832 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 20
0:00:01.621279026 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 21
0:00:01.621317116 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 22
0:00:01.621354811 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 23
0:00:01.621401401 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 24
0:00:01.621439960 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 25
0:00:01.621485130 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 26
0:00:01.621523280 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 27
0:00:01.621562277 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 28
0:00:01.621680045 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 29
0:00:01.621753093 12552 0x7f8934037190 DEBUG           v4l2videodec gstv4l2videodec.c:1349:gst_v4l2_video_dec_handle_frame:<nvv4l2decoder0> Handling frame 30
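As a side note, since my app uses the DeepStream Python APIs, I build this same description as a string and hand it to Gst.parse_launch. Just a sketch (the helper name build_rtsp_pipeline and the fakesink tail are for illustration only; my real app attaches the inference elements instead):

```python
def build_rtsp_pipeline(rtsp_url: str, latency_ms: int = 1000) -> str:
    """Assemble the gst-launch-1.0 description above as a single string,
    suitable for Gst.parse_launch() from the GStreamer Python bindings."""
    return (
        f"rtspsrc location={rtsp_url} protocols=tcp latency={latency_ms} "
        "drop-on-latency=1 timeout=5000000 ! "
        "rtph264depay ! h264parse ! "
        "nvv4l2decoder cudadec-memtype=2 num-extra-surfaces=1 ! "
        "queue leaky=2 max-size-buffers=1 ! "
        "nvvideoconvert nvbuf-memory-type=3 output-buffers=1 ! "
        "capsfilter caps=video/x-raw,format=RGBA ! fakesink"
    )

# e.g. pipeline = Gst.parse_launch(build_rtsp_pipeline("rtsp://..."))
```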

Hi @mfoglio . I ran your CLI in my env; it does not always work well. I also changed the decoder to avdec_h264, and it still cannot get data and decode the stream.

gst-launch-1.0 --gst-debug=libav:3 rtspsrc location=XXX  protocols=tcp latency=1000 drop-on-latency=1 timeout=5000000 ! rtph264depay ! h264parse ! avdec_h264 ! fakesink

When you use uridecodebin, you can also switch the decoder to avdec_h264; it has the same issue.
I think this is a compatibility issue between the GStreamer RTSP source plugin and your camera.
So we suggest you open a topic with the GStreamer project at the link below.
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/
Thanks

Hi @yuweiw , thank you for your reply.
In the other thread https://forums.developer.nvidia.com/t/nvv4l2decoder-does-not-work-with-certain-rtsp-video-stream/ , you suggested using uridecodebin to decode video streams. I agree that uridecodebin might be a better (more generic) solution. So is there a way to decode this stream with uridecodebin?

I’d like to stress that I am just looking for any possible way to support all cameras with DeepStream.
I am not attached to one method or the other; I just need a way to decode both streams.

To sum up, as of now:

  • My custom pipeline does not support the first stream I sent you. It kind of works with the second stream. (I noticed I was using video/x-raw(memory:NVMM) in the last capsfilter, if you want to try it out.)
  • uridecodebin cannot decode the second video stream at all.

Thank you

As a side note: I just tried again to launch:

gst-launch-1.0 uridecodebin uri=$RTSP_STREAM ! nvvideoconvert ! filesink location='test-nvv4l2decoder.mp4'

This pipeline previously seemed to work with the first stream, but I am not sure of that anymore. It saves a file, but I can’t find a way to open it.

ffprobe version 4.2.7-0ubuntu0.1 Copyright (c) 2007-2022 the FFmpeg developers
  built with gcc 9 (Ubuntu 9.4.0-1ubuntu1~20.04.1)
  configuration: --prefix=/usr --extra-version=0ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-nvenc --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
  libavutil      56. 31.100 / 56. 31.100
  libavcodec     58. 54.100 / 58. 54.100
  libavformat    58. 29.100 / 58. 29.100
  libavdevice    58.  8.100 / 58.  8.100
  libavfilter     7. 57.100 /  7. 57.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  5.100 /  5.  5.100
  libswresample   3.  5.100 /  3.  5.100
  libpostproc    55.  5.100 / 55.  5.100
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x55ce58b1ff00] Format mov,mp4,m4a,3gp,3g2,mj2 detected only with low score of 1, misdetection possible!
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x55ce58b1ff00] moov atom not found
test-nvv4l2decoder.mp4: Invalid data found when processing input

Note: I really doubt this has anything to do with it, but the only change I made to my system was updating CUDA on my host machine from 11.1 to 11.6 Update 1.

Again, my goal is to find a way to decode the cameras. I don’t care how, as long as it’s a DeepStream pipeline ;)

Thank you!

Hi @mfoglio , the pipeline you attached just saves the raw data, so you cannot open it with a player. If you want to save it to a file, you can try the pipeline below; it saves the data in H.264 format. Then you can play it with ffmpeg or another player that supports H.264 video.

gst-launch-1.0 uridecodebin --gst-debug=v4l2videodec:5  uri=XXX ! queue2 ! nvvideoconvert ! nvv4l2h264enc ! h264parse ! 'video/x-h264,stream-format=byte-stream' ! filesink location=test.h264
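To see why the earlier file would not open despite the .mp4 extension: a playable MP4 (ISO-BMFF) file starts with a box header whose type is usually ftyp, while your filesink wrote raw decoded frames. A quick heuristic check (the helper name is just for illustration):

```python
def looks_like_mp4(first_bytes: bytes) -> bool:
    """Heuristic: an ISO-BMFF (MP4) file begins with a 4-byte box size
    followed by a 4-byte box type, almost always b"ftyp"."""
    return len(first_bytes) >= 8 and first_bytes[4:8] in (b"ftyp", b"moov", b"mdat")

# with open("test-nvv4l2decoder.mp4", "rb") as f:
#     looks_like_mp4(f.read(8))  # a raw-frame dump will return False
```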

We suggest that you use uridecodebin in your code, since it's simpler and has better compatibility.
The second RTSP source does not send the right data to the decoder (either avdec_h264 or the NVIDIA decoder), so you can test it with avdec_h264 and report an issue to the GStreamer project. Thanks

Hi @yuweiw , thank you for your reply.
I think the second stream can send data correctly to avdec_h264, because we are able to process it when using avdec_h264. Still, it does not work with the NVIDIA decoder.

Hi @mfoglio , did you try the second stream with uridecodebin using avdec_h264? I tried, and it also cannot receive data. But avdec_h264 keeps trying to get data, so it does not get stuck. You can test in your env with uridecodebin using avdec_h264 by referring to the link below:
https://forums.developer.nvidia.com/t/dynamically-deleting-a-stream-causes-a-deadlock/205040/15

Hi @yuweiw , the second stream works with both pipelines:

gst-launch-1.0 --gst-debug=v4l2videodec:5 rtspsrc location=$RTSP_STREAM  protocols=tcp latency=1000 drop-on-latency=1 timeout=5000000 ! rtph264depay ! h264parse ! nvv4l2decoder cudadec-memtype=2 num-extra-surfaces=1 ! queue leaky=2 max-size-buffers=1 ! nvvideoconvert nvbuf-memory-type=3 output-buffers=1 ! capsfilter caps=video/x-raw,format=RGBA ! filesink location='test.mp4'
gst-launch-1.0 --gst-debug=v4l2videodec:5 rtspsrc location=$RTSP_STREAM  protocols=tcp latency=1000 drop-on-latency=1 timeout=5000000 ! rtph264depay ! h264parse ! avdec_h264 ! queue leaky=2 max-size-buffers=1 ! nvvideoconvert nvbuf-memory-type=3 output-buffers=1 ! capsfilter caps=video/x-raw,format=RGBA ! filesink location='test.mp4'

I also verified that it works correctly with uridecodebin and avdec_h264:

gst-launch-1.0 uridecodebin uri=$RTSP_STREAM ! filesink location='test.mp4'

Note that to get this running you need to disable the NVIDIA decoder. In the link you sent me ( Dynamically deleting a stream causes a deadlock - #15 by 549981178 ) there is an error: instead of cp libnvv4l2.so libnvv4l2.so.bk , you should use mv libnvv4l2.so libnvv4l2.so.bk.
Let me know if you can reproduce this.

Yeah, I can reproduce this. Your second source stream may be a little special. We are analyzing this issue.


Hi @mfoglio , I checked the stream data with Wireshark. The second RTSP stream does not send the SPS and PPS data to the decoder at the beginning, so our decoder cannot initialize.
The first RTSP stream always sends the SPS and PPS data at the beginning.
So could you help check why the second RTSP source does not send the SPS/PPS data first? Thanks
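If it helps, you can also check this on the receiving side: dump the depayloaded stream to a .h264 file (as in the filesink pipeline above) and look at the first NAL unit types. A toy Annex-B scanner (it ignores emulation-prevention bytes, so it is only a rough check):

```python
NAL_NAMES = {5: "IDR", 7: "SPS", 8: "PPS"}  # H.264 nal_unit_type values

def leading_nal_types(stream: bytes, limit: int = 10) -> list:
    """Return the nal_unit_type of the first few NAL units in an
    Annex-B H.264 byte stream (start codes 00 00 01 / 00 00 00 01)."""
    types, i = [], 0
    while len(types) < limit:
        j = stream.find(b"\x00\x00\x01", i)
        if j == -1 or j + 3 >= len(stream):
            break
        types.append(stream[j + 3] & 0x1F)  # low 5 bits of the NAL header byte
        i = j + 3
    return types

# A healthy dump should begin with SPS (7) and PPS (8) before the first IDR (5).
```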

Hi @yuweiw , I am not sure why that’s happening, but I can tell you that the stream comes from an “Axis Q6135-LE PTZ Network Camera”. This is a $2500 camera from a well-known manufacturer, so I doubt it has software issues.
I found another post, Video Codec SDK: Decoding problem, reporting issues decoding video from an Axis camera. Quoting the post:

For the Axis camera, the callback NvDecoder::HandlePictureDisplay(CUVIDPARSERDISPINFO *pDispInfo) has never been called. As a result I have always 0 decoded frames.

Video decoding is not my expertise so I can’t understand all the technical content of the post. However, in the last post, a user found a solution for FFmpeg. Maybe it could be useful to fix the problem in the source code of nvv4l2decoder. Let me know what you think.

Thank you for your help

Hi @mfoglio , what you referred to is FFmpeg code; it’s different from GStreamer, so it’s of limited use as a reference. Also, when I dumped the H.264 stream from your RTSP source, it played well with v4l2videodec.
So it’s not a one-sided issue; it may be an incompatibility between your special RTSP source and v4l2videodec. We are debugging it.

Thank you, @yuweiw . Let me know when you have any updates. Please don’t close the post in the meantime.

Good morning @yuweiw , do you have any update about a possible fix?

Hi @mfoglio , we have found the cause of the problem, and we are trying to find a better solution. It may be fixed in the next version.
Also, could you try to configure the RTSP stream to send SPS, PPS, and IDR together the first time it sends data?

Hi @yuweiw , unfortunately, sending SPS, PPS, and IDR together at the start would not solve my problem, as I have to deal with cameras that do not belong to my organization but to our customers. As a consequence, I can’t enable that setting on all their cameras.

OK. We are trying to resolve this issue; please wait for a future release. Thanks

Thank you!