DeepStream RTSP decoding failing, can't play with FFMPEG

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) GPU (in a Docker container)
• DeepStream Version 6.2 (from Docker image nvcr.io/nvidia/deepstream:6.2-base)
• JetPack Version (valid for Jetson only) N/A
• TensorRT Version N/A
• NVIDIA GPU Driver Version (valid for GPU only) 525.125.06
• Issue Type (questions, new requirements, bugs) Question/troubleshooting
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing) N/A
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description) N/A

Hello,

My team is using DeepStream with gstreamer inside a Docker container to process incoming RTSP/H.264 video streams (e.g. from external cameras): we turn the incoming frames into NumPy arrays, do some processing on these arrays, and then convert them back into frames to push into an RTSP/H.264 output stream. The RTSP server is running in a separate Docker container on the host machine, exposed to the host network. The goal is for our RTSP output stream to look and behave exactly as if it were a regular camera stream. However, we’re having some issues: the RTSP stream works, and VLC can play it (with some visual artifacts), but seemingly nothing else can play it successfully - most notably FFMPEG (via the ffplay command).

I do not believe Docker is related to the issue, since I am able to produce an alternative RTSP stream within the same environment NOT using gstreamer, with no issues (see below).

What doesn’t work (using gstreamer):
Here is our output pipeline (for testing, using a 640x480 stream, 30fps):
appsrc name=input do-timestamp=true is-live=true block=true format=3 max-bytes=73728000 ! video/x-raw,format=BGRx,framerate=30/1,width=640,height=480 ! nvvideoconvert ! video/x-raw(memory:NVMM),format=I420,framerate=30/1,width=640,height=480 ! nvv4l2h264enc bitrate=4000000 profile=4 ! rtspclientsink protocols=4 location=rtsp://10.77.0.135:8554/out1
I have an RTSP server (from Docker image bluenviron/mediamtx v1.1.1) running at 10.77.0.135:8554.
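
For completeness, here is roughly how we push the NumPy frames into the appsrc from Python (a simplified sketch of our code; push_frame is just an illustrative helper name):

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# The same pipeline string as above (appsrc ... rtspclientsink)
pipeline_str = (
    "appsrc name=input do-timestamp=true is-live=true block=true format=3 max-bytes=73728000 ! "
    "video/x-raw,format=BGRx,framerate=30/1,width=640,height=480 ! nvvideoconvert ! "
    "video/x-raw(memory:NVMM),format=I420,framerate=30/1,width=640,height=480 ! "
    "nvv4l2h264enc bitrate=4000000 profile=4 ! "
    "rtspclientsink protocols=4 location=rtsp://10.77.0.135:8554/out1"
)

pipeline = Gst.parse_launch(pipeline_str)
appsrc = pipeline.get_by_name("input")   # "appsrc name=input" from the string above
pipeline.set_state(Gst.State.PLAYING)

def push_frame(frame):
    # frame: 480x640x4 uint8 NumPy array in BGRx order, matching the appsrc caps
    buf = Gst.Buffer.new_wrapped(frame.tobytes())
    # do-timestamp=true on the appsrc takes care of timestamping the buffer
    if appsrc.emit("push-buffer", buf) != Gst.FlowReturn.OK:
        raise RuntimeError("push-buffer failed")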

When I try to play this stream with FFMPEG, here’s what I see:
FFMPEG command: ffplay rtsp://10.77.0.135:8554/out1
Result: it manages to pull the stream information, seemingly correctly, but it hangs after this output and is not able to play it:

ffplay version 4.4.2-0ubuntu0.22.04.1 Copyright (c) 2003-2021 the FFmpeg developers
  built with gcc 11 (Ubuntu 11.2.0-19ubuntu1)
  configuration: --prefix=/usr --extra-version=0ubuntu0.22.04.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libdav1d --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librabbitmq --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzimg --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-pocketsphinx --enable-librsvg --enable-libmfx --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
  libavutil      56. 70.100 / 56. 70.100
  libavcodec     58.134.100 / 58.134.100
  libavformat    58. 76.100 / 58. 76.100
  libavdevice    58. 13.100 / 58. 13.100
  libavfilter     7.110.100 /  7.110.100
  libswscale      5.  9.100 /  5.  9.100
  libswresample   3.  9.100 /  3.  9.100
  libpostproc    55.  9.100 / 55.  9.100
[rtsp @ 0x7f9bbc000cc0] decoding for stream 0 failed=    0B f=0/0   
Input #0, rtsp, from 'rtsp://10.77.0.135:8554/out1':
  Metadata:
    title           : Session streamed with GStreamer
  Duration: N/A, bitrate: N/A
  Stream #0:0: Video: h264 (High), yuv420p(tv, smpte170m/smpte170m/bt709, progressive), 640x480 [SAR 1:1 DAR 4:3], 30 fps, 30 tbr, 90k tbn, 60 tbc
    nan M-V:    nan fd=   0 aq=    0KB vq=    0KB sq=    0B f=0/0

What works (without gstreamer):
I have another RTSP stream which takes the exact same frames (as NumPy arrays), but instead of passing them to the appsrc of the gstreamer pipeline above, it sends them to a non-gstreamer RTSP streaming solution called Vidgear (basically just a wrapper around an FFMPEG command). It sends the stream to the same RTSP server, as out2 instead of out1. This one works perfectly fine, and the only difference between the two is whether the stream is created by the DeepStream/gstreamer pipeline or by Vidgear/FFMPEG.
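
Under the hood, Vidgear essentially runs an FFMPEG command along these lines, with the raw BGR frames piped to its stdin (the exact options here are illustrative, not our actual configuration):
ffmpeg -f rawvideo -pix_fmt bgr24 -s 640x480 -r 30 -i - -c:v libx264 -preset veryfast -pix_fmt yuv420p -f rtsp rtsp://10.77.0.135:8554/out2
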
FFMPEG command: ffplay rtsp://10.77.0.135:8554/out2
Result: There are some warnings (I believe due to i-frames, etc.), but after a few seconds the stream plays successfully (the last line keeps updating as it plays):

ffplay version 4.4.2-0ubuntu0.22.04.1 Copyright (c) 2003-2021 the FFmpeg developers
  built with gcc 11 (Ubuntu 11.2.0-19ubuntu1)
  configuration: --prefix=/usr --extra-version=0ubuntu0.22.04.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libdav1d --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librabbitmq --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzimg --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-pocketsphinx --enable-librsvg --enable-libmfx --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
  libavutil      56. 70.100 / 56. 70.100
  libavcodec     58.134.100 / 58.134.100
  libavformat    58. 76.100 / 58. 76.100
  libavdevice    58. 13.100 / 58. 13.100
  libavfilter     7.110.100 /  7.110.100
  libswscale      5.  9.100 /  5.  9.100
  libswresample   3.  9.100 /  3.  9.100
  libpostproc    55.  9.100 / 55.  9.100
[h264 @ 0x7fb870006300] co located POCs unavailableq=    0B f=0/0   
Input #0, rtsp, from 'rtsp://10.77.0.135:8554/out2':=    0B f=0/0   
  Metadata:
    title           : No Name
  Duration: N/A, start: 1.130533, bitrate: N/A
  Stream #0:0: Video: h264 (High), yuv420p(progressive), 640x480, 30 fps, 30 tbr, 90k tbn, 60 tbc
[h264 @ 0x7fb870044b80] co located POCs unavailable
[h264 @ 0x7fb870351000] co located POCs unavailable
[h264 @ 0x7fb870338580] mmco: unref short failure
   7.68 M-V:  0.017 fd=   0 aq=    0KB vq=  195KB sq=    0B f=0/0

As a reminder, VLC is able to play both of these streams successfully (though the artifacting behavior is different between the two). In contrast, FFMPEG can successfully play out2, but not out1 (which is the DeepStream/gstreamer one).

What I notice looking at these two FFMPEG outputs:

  • For out1 it can’t get a “start” time, presumably because it isn’t receiving frames correctly
  • The stream information is almost identical for both, except that for out1 it shows yuv420p(tv, smpte170m/smpte170m/bt709, progressive), while for out2 it only shows yuv420p(progressive). Also, for out1 it shows a SAR/DAR, while out2 does not.
  • out1 shows an error, [rtsp @ 0x7f9bbc000cc0] decoding for stream 0 failed, which I believe is the root of the problem
  • out2 shows a few warnings, but then it starts playing successfully and seems to be fine

My overall question:
I want the DeepStream/gstreamer RTSP stream (out1) to behave the same way as out2. Are there any issues with my pipeline, or changes to it, that could resolve this? Or could the issue be an incompatibility between the gstreamer pipeline and the “mediamtx” RTSP server? Is it related to the DeepStream components, or to gstreamer in general? I suspect the issue is closely related to the “decoding for stream 0 failed” error, but I am unsure what else to try after experimenting with some of the pipeline components without success.

Thanks!

Please add “h264parse” and “rtph264pay” before “rtspclientsink” if you just want to transfer the raw H.264 stream. The RTP header (RFC 3550: RTP: A Transport Protocol for Real-Time Applications, rfc-editor.org) lets the RTSP receiver identify what the payload is; see also RFC 2326 - Real Time Streaming Protocol (RTSP), ietf.org. It is worth reading the protocol documents to understand why.

Another suggestion is to set a shorter IDR interval on the encoder so that clients can start playback more easily.

Thank you for this info! I tried adding h264parse and rtph264pay, but got a “could not link rtph264pay0 to rtspclientsink0” error. However, removing rtph264pay and using just h264parse before rtspclientsink allowed the pipeline to run, although it still had the same issues. I found that the idrinterval property was not available on the nvv4l2h264enc element in DeepStream 6.2, but updating to DeepStream 6.3 seemed to fix the issue entirely (and also let me use idrinterval). With 6.3, the stream now plays well in VLC, FFMPEG, and other video clients.
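
For reference, the working pipeline on DeepStream 6.3 now looks roughly like this (idrinterval=30 is just an example value, i.e. one IDR frame per second at 30 fps):
appsrc name=input do-timestamp=true is-live=true block=true format=3 max-bytes=73728000 ! video/x-raw,format=BGRx,framerate=30/1,width=640,height=480 ! nvvideoconvert ! video/x-raw(memory:NVMM),format=I420,framerate=30/1,width=640,height=480 ! nvv4l2h264enc bitrate=4000000 profile=4 idrinterval=30 ! h264parse ! rtspclientsink protocols=4 location=rtsp://10.77.0.135:8554/out1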

I believe this may be a related issue, but I’m not sure: “Deepstream 6.2 version seems not produce idr periodically”

A live stream client has to receive at least one IDR frame before it can decode. Because DS 6.2 does not produce any IDR frames after the first one, a VLC or ffmpeg client that starts pulling the stream after that frame has already gone by will never receive an IDR. If you start pulling the stream first, and it does not time out before you start pushing the stream, you can play the video, but that is very inconvenient.
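
As an illustration (the exact command is an assumption about your setup, adjust as needed), you can check whether periodic IDR frames are reaching the server with ffprobe:
ffprobe -rtsp_transport tcp -select_streams v:0 -show_entries frame=key_frame,pict_type rtsp://10.77.0.135:8554/out1
On a stream with a regular IDR interval you will see key_frame=1 / pict_type=I entries repeating; if no frames are printed at all, the client never received an IDR frame to start decoding from.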

Yes. DeepStream 6.3 has fixed the problem. Please use DeepStream 6.3.

