RTSP Output Artifacting with Multiple Sinks in Customized deepstream-app inference pipeline

• Hardware Platform: dGPU (RTX 4090)
• DeepStream Version: 7.0
• NVIDIA GPU Driver Version: 535.230.02 (CUDA 12.2)
• Issue Type: question / bug

I’m running a customized deepstream-app pipeline, with modifications to analytics_done_buf_probe to overlay items on the OSD and run some detection algorithms. The pipeline uses 4 RTSP URI sources with a batch size of 4, streaming output via RTSP on ports 8553–8556.

I’m encountering severe artifacting and freezes in the output RTSP streams, especially during movement. VLC often fails to load the stream, or takes a long time; once it does load, it either freezes on a frame or shows artifacts until the pipeline is terminated. With EGLsink or a file-sink, however, the output is smooth and free of artifacts even with multiple streams.

I’ve tried various configurations with both nvstreammux and new nvstreammux, and adjusted sink properties and other settings, but the problem persists across all four streams. While I suspect network limitations may contribute, others have reported improvements through config tweaks.

My requirement is RTSP output with minimal artifacting, at least comparable to the input stream in terms of latency and quality. Please let me know if any further config changes could improve performance. Each camera is supposed to provide 25 fps at 1920x1080 input.

One more detail: I tried the same config on an RTX 3090 based system, but I couldn’t get the hardware-based encoding setting to run on it, so I tried software encoding and got the same issues, though a bit worse in terms of pixelation and artifacting.

deepstream-app config:

[application]
enable-perf-measurement = 1
perf-measurement-interval-sec = 5
kitti-track-output-dir = /opt/nvidia/deepstream/deepstream-7.0/sources/apps/sample_apps/deepstream-app/detections/

[tiled-display]
enable=0
rows=1
columns=1
width=1280
height=720
gpu-id=0
nvbuf-memory-type=0

[source0]
enable=1
type=4
uri=rtsp://link1
num-sources=1
gpu-id=0
cudadec-memtype=0
latency=0
select-rtp-protocol=4


[source1]
enable=1
type=4
uri=rtsp://link2
num-sources=1
gpu-id=0
cudadec-memtype=0
latency=0
select-rtp-protocol=4

[source2]
enable=1
type=4
uri=rtsp://link3
num-sources=1
gpu-id=0
cudadec-memtype=0
latency=0
select-rtp-protocol=4

[source3]
enable=1
type=4
uri=rtsp://link4
num-sources=1
gpu-id=0
cudadec-memtype=0
latency=0
select-rtp-protocol=4

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=4
gpu-id=0
rtsp-port=8553 # <  change port
#1=h264 2=h265
codec=2
source-id=0 # indicate source-id here
container=1
output-file=out.mp4
#encoder type 0=Hardware 1=Software
enc-type=0
sync=0
bitrate=8000000
#H264 Profile - 0=Baseline 2=Main 4=High
#H265 Profile - 0=Main 1=Main10
profile=1
nvbuf-memory-type=1

[sink1]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=4
gpu-id=0
rtsp-port=8554 # <  change port
#1=h264 2=h265
codec=2
source-id=1 # indicate source-id here
container=1
output-file=out.mp4
#encoder type 0=Hardware 1=Software
enc-type=0
sync=0
bitrate=8000000
#H264 Profile - 0=Baseline 2=Main 4=High
#H265 Profile - 0=Main 1=Main10
profile=1
nvbuf-memory-type=1

[sink2]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=4
gpu-id=0
rtsp-port=8555 # <  change port
#1=h264 2=h265
codec=2
source-id=2 # indicate source-id here
container=1
output-file=out.mp4
#encoder type 0=Hardware 1=Software
enc-type=0
sync=0
bitrate=8000000
#H264 Profile - 0=Baseline 2=Main 4=High
#H265 Profile - 0=Main 1=Main10
profile=1
nvbuf-memory-type=1

[sink3]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=4
gpu-id=0
rtsp-port=8556 # <  change port
#1=h264 2=h265
codec=2
source-id=3 # indicate source-id here
container=1
output-file=out.mp4
#encoder type 0=Hardware 1=Software
enc-type=0
sync=0
bitrate=8000000
#H264 Profile - 0=Baseline 2=Main 4=High
#H265 Profile - 0=Main 1=Main10
profile=1
nvbuf-memory-type=1

[osd]
enable = 1
gpu-id = 0
border-width = 0
text-size = 15
text-color = 1;1;1;0
text-bg-color = 0.3;0.3;0.3;0
font = Serif
nvbuf-memory-type = 0

[streammux]
gpu-id = 0
live-source = 1
batch-size = 4
batched-push-timeout = 20000
width=1280
height=720
sync-inputs=0
enable-padding = 1
nvbuf-memory-type = 1
config-file=/opt/nvidia/deepstream/deepstream-7.0/sources/apps/sample_apps/deepstream-app/config/newnvstreammux.txt

[primary-gie]
enable = 1
gpu-id = 0
gie-unique-id = 1
nvbuf-memory-type = 0
config-file = /opt/nvidia/deepstream/deepstream-7.0/sources/apps/sample_apps/deepstream-app/config/config_infer_primary_yoloV5.txt

[tracker]
enable = 1
gpu-id = 0
tracker-height = 640
tracker-width = 480
ll-lib-file = /opt/nvidia/deepstream/deepstream-7.0/lib/libnvds_nvmultiobjecttracker.so
ll-config-file = /opt/nvidia/deepstream/deepstream-7.0/samples/configs/deepstream-app/config_tracker_NvDCF_perf.yml

[optical-flow]
enable = 0


[tests]
file-loop = 0

newnvstreammux_config:

[property]
algorithm-type=1
max-fps-control=0
overall-max-fps-n=30
overall-max-fps-d=1
overall-min-fps-n=30
overall-min-fps-d=1
max-same-source-frames=1

  1. About “VLC often fails to load the stream or takes a long time”, please refer to this topic.
  2. About the artifacts issue: if you play the output RTSP on the device running the deepstream-app, does the artifacts issue persist? Wondering if it is related to the network. About “EGLsink or a file-sink, the output is smooth”: do you mean that with 4 filesinks, all 4 files have no artifacts?
  3. If the fps of the input source is 25, please set batched-push-timeout to 40000. bitrate=8000000 is too high for 720p; please set bitrate to 2000000. Please refer to the FAQ for more improvement methods.
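Applied to the config posted above, those two suggestions would look like this (config fragment only; all other keys unchanged):

```ini
[streammux]
# one frame interval in microseconds: 1000000 / 25 fps
batched-push-timeout=40000

[sink0]
# ~2 Mbps is a reasonable target for a 720p encode
bitrate=2000000
```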

Hi,

  • I managed to run it through VLC somehow, but it still lags whenever multiple sinks are present, and it takes a long while to load even with idrinterval=60. With a single stream as input it doesn’t take too long to load.

  • The artifacts are only present when I use the RTSP output type for the sinks. With EGLsink I could get a smooth stream with no artifacting, and similarly the file-sink type, which recorded the output file as an mp4, had no artifacting with the tiled display enabled.

  • I changed these settings to the recommended values and still ended up with artifacting. Here’s a sample with a single input stream running for more than 10 minutes on a T4 GPU instance; this grey artifacting happened whenever movement occurred on the stream.

Current config:

[application]
enable-perf-measurement = 1
perf-measurement-interval-sec = 5
kitti-track-output-dir = /opt/nvidia/deepstream/deepstream-7.0/sources/apps/sample_apps/deepstream-app/detections/

[tiled-display]
enable=0
rows=1
columns=1
width=1280
height=720
gpu-id=0
nvbuf-memory-type=0

[source0]
enable=1
type=4
uri=rtsp://link1
num-sources=1
gpu-id=0
cudadec-memtype=0
latency=0
select-rtp-protocol=4


[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=4
gpu-id=0
rtsp-port=8553 # <  change port
#1=h264 2=h265
codec=2
source-id=0 # indicate source-id here
container=1
output-file=out.mp4
#encoder type 0=Hardware 1=Software
enc-type=0
sync=0
bitrate=2000000
#H264 Profile - 0=Baseline 2=Main 4=High
#H265 Profile - 0=Main 1=Main10
profile=1
nvbuf-memory-type=1


[osd]
enable = 1
gpu-id = 0
border-width = 0
text-size = 15
text-color = 1;1;1;0
text-bg-color = 0.3;0.3;0.3;0
font = Serif
nvbuf-memory-type = 0

[streammux]
gpu-id = 0
live-source = 1
batch-size = 1
batched-push-timeout = 400000
width=1280
height=720
sync-inputs=0
enable-padding = 1
nvbuf-memory-type = 1

[primary-gie]
enable = 1
gpu-id = 0
gie-unique-id = 1
nvbuf-memory-type = 0
config-file = /opt/nvidia/deepstream/deepstream-7.0/sources/apps/sample_apps/deepstream-app/config/config_infer_primary_yoloV5.txt

[tracker]
enable = 1
gpu-id = 0
tracker-height = 640
tracker-width = 480
ll-lib-file = /opt/nvidia/deepstream/deepstream-7.0/lib/libnvds_nvmultiobjecttracker.so
ll-config-file = /opt/nvidia/deepstream/deepstream-7.0/samples/configs/deepstream-app/config_tracker_NvDCF_perf.yml

[optical-flow]
enable = 0


[tests]
file-loop = 0

Here are some methods to narrow down this issue.

  1. If using one RTSP input and eglsink, does the artifacts issue persist after 10 minutes? Wondering if the issue is related to the source.
  2. If using one RTSP input and RTSP sink: to rule out the network, if you play the output RTSP on the device running the deepstream-app, does the artifacts issue persist? You can use the following cmd or ffplay.
gst-launch-1.0 uridecodebin uri=rtsp://xxx !  nveglglessink

Here are the ffplay logs for a single RTSP source input and the output from DeepStream:

ffplay -i "rtsp://127.0.0.1:8553/ds-test"
ffplay version 4.4.2-0ubuntu0.22.04.1 Copyright (c) 2003-2021 the FFmpeg developers
  built with gcc 11 (Ubuntu 11.2.0-19ubuntu1)
  configuration: --prefix=/usr --extra-version=0ubuntu0.22.04.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libdav1d --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librabbitmq --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzimg --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-pocketsphinx --enable-librsvg --enable-libmfx --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
  libavutil      56. 70.100 / 56. 70.100
  libavcodec     58.134.100 / 58.134.100
  libavformat    58. 76.100 / 58. 76.100
  libavdevice    58. 13.100 / 58. 13.100
  libavfilter     7.110.100 /  7.110.100
  libswscale      5.  9.100 /  5.  9.100
  libswresample   3.  9.100 /  3.  9.100
  libpostproc    55.  9.100 / 55.  9.100
pci id for fd 8: 1414:0006, driver (null)
MESA-LOADER: failed to open hyperv_drm: /usr/lib/dri/hyperv_drm_dri.so: cannot open shared object file: No such file or directory (search paths /usr/lib/x86_64-linux-gnu/dri:\$${ORIGIN}/dri:/usr/lib/dri, suffix _dri)
pci id for fd 9: 1414:0006, driver (null)
kmsro: driver missing
[hevc @ 0x75dd980062c0] PPS id out of range: 00KB sq=    0B f=0/0   
    Last message repeated 1 times
[hevc @ 0x75dd980062c0] Error parsing NAL unit #0.
[hevc @ 0x75dd980062c0] PPS id out of range: 00KB sq=    0B f=0/0   
    Last message repeated 1 times
[hevc @ 0x75dd980062c0] Error parsing NAL unit #0.
[hevc @ 0x75dd980062c0] PPS id out of range: 0
    Last message repeated 1 times
[hevc @ 0x75dd980062c0] Error parsing NAL unit #0.
[rtsp @ 0x75dd98000cc0] max delay reached. need to consume packet   
[rtsp @ 0x75dd98000cc0] RTP: missed 54 packets
Input #0, rtsp, from 'rtsp://127.0.0.1:8553/ds-test':    0B f=0/0   
  Metadata:
    title           : Session streamed with GStreamer
    comment         : rtsp-server
  Duration: N/A, start: 0.065278, bitrate: N/A
  Stream #0:0: Video: hevc (Main 10), yuv420p(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 30 fps, 29.92 tbr, 90k tbn, 30 tbc
[hevc @ 0x75dd98038580] Could not find ref with POC -20
[hevc @ 0x75dd98038580] Could not find ref with POC -21  0B f=0/0   
[hevc @ 0x75dd98038580] Could not find ref with POC -22
[hevc @ 0x75dd98038580] Could not find ref with POC -23
[hevc @ 0x75dd980de7c0] Could not find ref with POC -5   0B f=0/0   
[hevc @ 0x75dd980de7c0] Could not find ref with POC -6
[hevc @ 0x75dd980de7c0] Could not find ref with POC -7
[hevc @ 0x75dd980de7c0] Could not find ref with POC -8
[rtsp @ 0x75dd98000cc0] max delay reached. need to consume packet
[rtsp @ 0x75dd98000cc0] RTP: missed 38 packets
[rtsp @ 0x75dd98000cc0] max delay reached. need to consume packet
[rtsp @ 0x75dd98000cc0] RTP: missed 33 packets
[hevc @ 0x75dd98083000] Could not find ref with POC 21   0B f=0/0   
[hevc @ 0x75dd98083000] Could not find ref with POC 20
[hevc @ 0x75dd98083000] Could not find ref with POC 19
[rtsp @ 0x75dd98000cc0] max delay reached. need to consume packet   
[rtsp @ 0x75dd98000cc0] RTP: missed 77 packets
[hevc @ 0x75dd980de7c0] Could not find ref with POC 46   0B f=0/0   
[hevc @ 0x75dd980de7c0] Could not find ref with POC 45
[hevc @ 0x75dd980de7c0] Could not find ref with POC 44
[hevc @ 0x75dd980cdf40] Could not find ref with POC 70   0B f=0/0   
[hevc @ 0x75dd980cdf40] Could not find ref with POC 69
[hevc @ 0x75dd980cdf40] Could not find ref with POC 68
[hevc @ 0x75dd980cdf40] Could not find ref with POC 67
[rtsp @ 0x75dd98000cc0] max delay reached. need to consume packet
[rtsp @ 0x75dd98000cc0] RTP: missed 48 packets
[hevc @ 0x75dd980cdf40] Could not find ref with POC 98   0B f=0/0   
[hevc @ 0x75dd980cdf40] Could not find ref with POC 97
[hevc @ 0x75dd980cdf40] Could not find ref with POC 96
[hevc @ 0x75dd980cdf40] Could not find ref with POC 95
[rtsp @ 0x75dd98000cc0] max delay reached. need to consume packet
[rtsp @ 0x75dd98000cc0] RTP: missed 73 packets
[hevc @ 0x75dd98038580] Could not find ref with POC 123  0B f=0/0   
[hevc @ 0x75dd98038580] Could not find ref with POC 122
[hevc @ 0x75dd98038580] Could not find ref with POC 121
[hevc @ 0x75dd98038580] Could not find ref with POC 120
[rtsp @ 0x75dd98000cc0] max delay reached. need to consume packet   
[rtsp @ 0x75dd98000cc0] RTP: missed 19 packets
[hevc @ 0x75dd98083000] Could not find ref with POC 147  0B f=0/0   
[hevc @ 0x75dd98083000] Could not find ref with POC 146
[rtsp @ 0x75dd98000cc0] max delay reached. need to consume packet
[rtsp @ 0x75dd98000cc0] RTP: missed 38 packets
[hevc @ 0x75dd98083000] Could not find ref with POC 171  0B f=0/0   
[hevc @ 0x75dd98083000] Could not find ref with POC 170
[hevc @ 0x75dd98083000] Could not find ref with POC 169
[hevc @ 0x75dd98083000] Could not find ref with POC 168
[rtsp @ 0x75dd98000cc0] max delay reached. need to consume packet
[rtsp @ 0x75dd98000cc0] RTP: missed 78 packets
[hevc @ 0x75dd98083000] Could not find ref with POC 200  0B f=0/0   
[hevc @ 0x75dd98083000] Could not find ref with POC 199
[hevc @ 0x75dd98083000] Could not find ref with POC 198
[hevc @ 0x75dd98083000] Could not find ref with POC 197
[rtsp @ 0x75dd98000cc0] max delay reached. need to consume packet   
[rtsp @ 0x75dd98000cc0] RTP: missed 49 packets
[hevc @ 0x75dd980cdf40] Could not find ref with POC 39   0B f=0/0   
[hevc @ 0x75dd980cdf40] Could not find ref with POC 38
[hevc @ 0x75dd980cdf40] Could not find ref with POC 37
[hevc @ 0x75dd980cdf40] Could not find ref with POC 36
[rtsp @ 0x75dd98000cc0] max delay reached. need to consume packet
[rtsp @ 0x75dd98000cc0] RTP: missed 57 packets
[hevc @ 0x75dd980cdf40] Could not find ref with POC 65   0B f=0/0   
[hevc @ 0x75dd980cdf40] Could not find ref with POC 64
[hevc @ 0x75dd980cdf40] Could not find ref with POC 63
[hevc @ 0x75dd980cdf40] Could not find ref with POC 62
[rtsp @ 0x75dd98000cc0] max delay reached. need to consume packet
[rtsp @ 0x75dd98000cc0] RTP: missed 101 packets
[rtsp @ 0x75dd98000cc0] max delay reached. need to consume packet   
[rtsp @ 0x75dd98000cc0] RTP: missed 22 packets
[hevc @ 0x75dd98083000] Could not find ref with POC 90   0B f=0/0   
[hevc @ 0x75dd98083000] Could not find ref with POC 89
[hevc @ 0x75dd98083000] Could not find ref with POC 88
[hevc @ 0x75dd98083000] Could not find ref with POC 87
[rtsp @ 0x75dd98000cc0] max delay reached. need to consume packet   
[rtsp @ 0x75dd98000cc0] RTP: missed 18 packets
[rtsp @ 0x75dd98000cc0] max delay reached. need to consume packet   
[rtsp @ 0x75dd98000cc0] RTP: missed 33 packets
[hevc @ 0x75dd98075300] Could not find ref with POC 140  0B f=0/0   
[hevc @ 0x75dd98038580] Could not find ref with POC 164  0B f=0/0   
[hevc @ 0x75dd98038580] Could not find ref with POC 163
[hevc @ 0x75dd98038580] Could not find ref with POC 162
[hevc @ 0x75dd98038580] Could not find ref with POC 161
[rtsp @ 0x75dd98000cc0] max delay reached. need to consume pack

I tried running it through the gst command, however it failed:

gst-launch-1.0 uridecodebin uri=rtsp://localhost:8553/ds-test !  nveglglessink
Setting pipeline to PAUSED ...
Failed to set pipeline to PAUSED.
Setting pipeline to NULL ...
Freeing pipeline ...

I can access it via VLC, but I’m still getting artifacting in the output stream from DeepStream.

Are you playing the output RTSP with ffplay on the device running the DeepStream application? It seems there are many H.265 decoding errors. Can you try H.264 encoding (codec=1)?

Hi, changing the codec to H.264 helped a bit, and I adjusted profile=4 (High) as well. The grey artifacting is gone for a single RTSP source, however normal pixelation and the regular artifacts remain.

And this is the output when streaming via HTTP Live Streaming.

Hi, one update: I tried with a sample video stream served as RTSP, and it seems consistent for a while; I’m monitoring it longer to see if any artifacting shows up in single-stream performance. Even with HLS it is consistent for a single stream with H.264, so there may have been some issue with the other streams.

Please let me know if you have any suggestions for multi-stream output and ways to mitigate network irregularities to avoid artifacts.

  1. What do you mean by “via http live streaming”? Noticing the output type is RTSP, why is there HTTP live streaming?
  2. Do you mean that using a sample video stream as the RTSP source and an RTSP sink, playing the output RTSP is fine for a long time? If so, the issue seems related to the source. Could you double check “when using EGLsink or a file-sink in deepstream-app, the output is smooth and free of artifacts even with multiple streams”?

Hi, for HLS streaming I aim to view the RTSP via HLS as a separate service. I compared both VLC and the HLS output through a web server; both had the same results in terms of output once switched to the H.264 encoder.
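The remux is along these lines (illustrative sketch only; the port and segment settings here are placeholders, not the exact command used):

```shell
# Illustrative: pull the DeepStream RTSP output and repackage it as HLS
# without re-encoding. Port and HLS segment settings are placeholders.
ffmpeg -rtsp_transport tcp -i rtsp://127.0.0.1:8553/ds-test \
  -c copy -f hls -hls_time 2 -hls_list_size 5 -hls_flags delete_segments \
  stream.m3u8
```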

As for testing with RTSP, I converted the sample mp4 file into an RTSP stream that can be played locally, looping the video endlessly, and used that as the input URI source for DeepStream. I tested with a file sink, with single- and multi-source input, and could get a clean output without any artifacts or delay. EGLsink also worked in a similar manner earlier; however, I cannot properly test EGLsink right now, as I’m using a VM with a T4 GPU for testing and it has no display. But earlier, while testing on an RTX 3090 based system with an RTSP based IP cam, I could get a steady output with EGLsink even with multiple streams running at the same time, steady compared to the RTSP-type sink, which had too many artifacts and greying out. [Note: the EGLsink test used software-based H.264 encoding, as the RTX 3090 has some bug with hardware encoding.]

Here are some methods to narrow down this issue.

  1. To rule out the network issue, please don’t play the output RTSP over the web. Please play the output RTSP on the device running the deepstream-app, as mentioned above.
  2. Do you mean that using multiple sources (sample mp4 file turned into an RTSP stream) and multiple RTSP sinks, playing all the output RTSP streams is fine for a long time? If so, the only difference from testing the other RTSP sources is the source itself.
  3. If using one RTSP source A (not an mp4 file turned into an RTSP stream) and one RTSP sink, is playing the output RTSP fine for a long time?
  4. If using one RTSP source B (not an mp4 file turned into an RTSP stream) and one output sink, is playing the output RTSP fine for a long time?
  5. If using two RTSP sources A and B (not mp4 files turned into RTSP streams) and two output sinks, is playing all the output RTSP streams fine for a long time?

Hi,

  1. The HLS streaming is done on the same network where the RTSP is viewed; it just uses ffmpeg to generate an HLS format viewed on localhost. However, I switched to VLC and ffmpeg on the same device for testing the following scenarios.

  2. Using multiple sources with the same RTSP link, i.e. source0–4 all used the same RTSP link, set up locally as a fake RTSP server generating RTSP links with ffmpeg out of the sample.mp4 video and looping it internally. On checking the output generated by DeepStream, none of the sources showed artifacts with filesink or eglsink; the FPS was slowed down with 4 sources, but the artifacts were not present. However, when the same setup is used with an RTSP sink and the above optimized config, the artifacts are still present, as shown in the screenshots below.

  3. What do you mean by an RTSP source A and B that is “not mp4 file into an rtsp stream”? If you mean a simple RTSP stream from a different network, then yes: playing the outputs with a single source resulted in minimal artifacts compared to having multiple sources. So RTSP output with RTSP source A only resulted in minimal artifacts and remained stable for over 20 minutes.

Also, if you meant a fake RTSP link on the same network as the environment, that also resulted in no artifacts over a longer duration [this had the sample.mp4 served as an RTSP link separately through a fake RTSP generator using ffmpeg, looping the video endlessly as mentioned above].

  4. Consistent with point 2: with a filesink or eglsink output I couldn’t get an RTSP output, as it’s a file or EGL output, but the output stayed consistent and without any artifacts, more stable than the RTSP sink in point 3.

  5. While using 2 RTSP sources A and B, the output stops and reaches EOS automatically after a while, and had heavy artifacting in the RTSP output, as attached in the screenshots and text below. (Note: FPS started at 20, reduced to 11–13 on average after 4–5 minutes, and after 7 minutes the stream reached EOS automatically; the RTSP input links were still active when the stream reached EOS.) [Note: after increasing the interval in the pgie config from 1 to 4, the stream was more consistent and didn’t reach EOS suddenly; however, artifacts still persisted. This only happened with multiple sources and sinks.]

nvstreammux: Successfully handled EOS for source_id=0
nvstreammux: Successfully handled EOS for source_id=1
** INFO: <bus_callback:330>: Received EOS. Exiting ...

Quitting
[NvMultiObjectTracker] De-initialized
App run successful

[Additional clarification: it would be great if you could elaborate on the difference between an RTSP sink and an output sink with RTSP output.]

  1. If using deepstream-app, the RTSP sink and output sink are the same. deepstream-app is open source: the app sends the encoded video stream to udpsink, then uses the GStreamer module GstRTSPServer to support the RTSP/RTP protocol.
  2. From your description “when using a file-sink, the output is smooth”, the artifact issue should occur after encoding. If using two RTSP sources (simple RTSP streams from a different network, not the fake RTSP) and RTSP sinks, could you use the following cmd to record 1 minute of output RTSP? (The output RTSP URI will be printed on the terminal. Please provide two files, one for [sink0] and the other for [sink1].) Especially, to rule out the network issue, please run the cmd on the device running the deepstream-app.
gst-launch-1.0 rtspsrc location=XXX ! rtph264depay ! h264parse ! 'video/x-h264,stream-format=byte-stream' ! filesink location=test.h264
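For reference, the udpsink + GstRTSPServer path described in point 1 is roughly equivalent to the following sender pipeline (sketch only; the multicast address and port shown are assumptions mirroring deepstream-app defaults, and GstRTSPServer then re-serves the stream at rtsp://<host>:<rtsp-port>/ds-test):

```shell
# Sketch of deepstream-app's RTSP sink internals: encoded, RTP-payloaded
# frames go to udpsink; a GstRTSPServer instance re-serves that UDP stream.
# host/port values here are assumptions, not taken from this thread.
gst-launch-1.0 videotestsrc ! nvvideoconvert ! nvv4l2h264enc bitrate=2000000 \
  ! h264parse ! rtph264pay ! udpsink host=224.224.255.255 port=5400 sync=0
```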

Hi, I tried the recommended recording procedure with two RTSP sources (on different networks, with 3 fps each on the stream) and got these logs for both output RTSP sinks:
1.

gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8550/ds-test ! rtph264depay ! h264parse ! 'video/x-h264,stream-format=byte-stream' ! filesink location=sink0_output.h264
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Pipeline is PREROLLED ...
Prerolled, waiting for progress to finish...
Progress: (connect) Connecting to rtsp://127.0.0.1:8550/ds-test
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Redistribute latency...
Progress: (request) Sending PLAY request
Redistribute latency...
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Unhandled error
Additional debug info:
../gst/rtsp/gstrtspsrc.c(6795): gst_rtspsrc_send (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Internal Server Error (500)
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Could not write to resource.
Additional debug info:
../gst/rtsp/gstrtspsrc.c(8905): gst_rtspsrc_play (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Could not send message. (Generic error)
Execution ended after 0:00:00.001913060
Setting pipeline to NULL ...
Freeing pipeline ...

It tried decoding a few frames for under a second but stopped midway and gave this error. I tried the same with ffprobe and got this:

ffprobe -i "rtsp://127.0.0.1:8550/ds-test"
ffprobe version 4.4.2-0ubuntu0.22.04.1 Copyright (c) 2007-2021 the FFmpeg developers
  built with gcc 11 (Ubuntu 11.2.0-19ubuntu1)
  configuration: --prefix=/usr --extra-version=0ubuntu0.22.04.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libdav1d --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librabbitmq --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzimg --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-pocketsphinx --enable-librsvg --enable-libmfx --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
  libavutil      56. 70.100 / 56. 70.100
  libavcodec     58.134.100 / 58.134.100
  libavformat    58. 76.100 / 58. 76.100
  libavdevice    58. 13.100 / 58. 13.100
  libavfilter     7.110.100 /  7.110.100
  libswscale      5.  9.100 /  5.  9.100
  libswresample   3.  9.100 /  3.  9.100
  libpostproc    55.  9.100 / 55.  9.100
[rtsp @ 0x603332985340] max delay reached. need to consume packet
[rtsp @ 0x603332985340] RTP: missed 1557 packets
[rtsp @ 0x603332985340] RTP: dropping old packet received too late
    Last message repeated 26 times
[h264 @ 0x60333298a7c0] illegal short term buffer state detected
[rtsp @ 0x603332985340] RTP: dropping old packet received too late
    Last message repeated 28 times
[h264 @ 0x60333298a7c0] Missing reference picture, default is 188
[rtsp @ 0x603332985340] RTP: dropping old packet received too late
    Last message repeated 753 times
Input #0, rtsp, from 'rtsp://127.0.0.1:8550/ds-test':
  Metadata:
    title           : Session streamed with GStreamer
    comment         : rtsp-server
  Duration: N/A, start: 1.975222, bitrate: N/A
  Stream #0:0: Video: h264 (High), yuv420p(tv, bt709, progressive), 1280x720 [SAR 1:1 DAR 16:9], 3 fps, 3 tbr, 90k tbn, 6 tbc

I tried playing it with VLC as well, but it showed artifacts after running for 10 minutes; not as heavy as earlier, but still present.

  1. When I tried with a single source and single sink, I could record the RTSP as an H.264 file with gst-launch-1.0; there were still some artifacts present in the output (this was with the same RTSP link used in the above test).

Note: this is only with one URI source and 1 RTSP output sink.

  2. While testing with new RTSP links from a different network, I tried getting the file-sink output as an mp4 and now suddenly got similar artifacts there as well [note: single URI source]. My config is the same as earlier if you need to take a look:
[application]
enable-perf-measurement = 1
perf-measurement-interval-sec = 5
kitti-track-output-dir = /opt/nvidia/deepstream/deepstream-7.0/sources/apps/sample_apps/deepstream-app/detections/

[tiled-display]
enable=0
rows=1
columns=1
width=1280
height=720
gpu-id=0
nvbuf-memory-type=0

[source0]
enable=1
type=4
uri=rtsp://link
num-sources=1
gpu-id=0
cudadec-memtype=0
latency=0
select-rtp-protocol=4


[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=3 #changed to file sink here
gpu-id=0
rtsp-port=8550 # <  change port
#1=h264 2=h265
codec=1
source-id=0 # indicate source-id here
container=1
output-file=out_sink0.mp4
#encoder type 0=Hardware 1=Software
enc-type=0
sync=0
bitrate=2000000
#H264 Profile - 0=Baseline 2=Main 4=High
#H265 Profile - 0=Main 1=Main10
profile=4
nvbuf-memory-type=1



[osd]
enable = 1
gpu-id = 0
border-width = 0
text-size = 15
text-color = 1;1;1;0
text-bg-color = 0.3;0.3;0.3;0
font = Serif
nvbuf-memory-type = 0

[streammux]
gpu-id = 0
live-source = 1
batch-size = 2
batched-push-timeout = 400000
width=1280
height=720
sync-inputs=0
enable-padding = 1
nvbuf-memory-type = 1

[primary-gie]
enable = 1
gpu-id = 0
gie-unique-id = 1
nvbuf-memory-type = 0
config-file = /opt/nvidia/deepstream/deepstream-7.0/sources/apps/sample_apps/deepstream-app/config/config_infer_primary_yoloV5.txt

[tracker]
enable = 1
gpu-id = 0
tracker-height = 640
tracker-width = 480
ll-lib-file = /opt/nvidia/deepstream/deepstream-7.0/lib/libnvds_nvmultiobjecttracker.so
ll-config-file = /opt/nvidia/deepstream/deepstream-7.0/samples/configs/deepstream-app/config_tracker_NvDCF_perf.yml

[optical-flow]
enable = 0


[EDIT: switched the streammux batch-size to 1 and still got the same artifacts. Is it because the RTSP input has these bad frames?]

Let’s use this RTSP source to narrow down the artifacts issue. It seems that with one RTSP source and one filesink, the artifacts issue persists.

  1. “latency=0” is too small. Please try “latency=1500”.
  2. What is the fps of the original RTSP source? You can use ffprobe to check. What is the fps of this mp4?
  3. Please refer to this topic. You can set drop-on-latency of rtspsrc to false, meaning “don’t drop the oldest buffers”.
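You can sanity-check these rtspsrc properties outside DeepStream with a plain playback pipeline (the camera URI below is a placeholder):

```shell
# Pull the camera directly with a 1500 ms jitterbuffer and without
# dropping late buffers; if raw playback is clean here, the source side
# of the DeepStream pipeline is likely fine. The URI is a placeholder.
gst-launch-1.0 rtspsrc location=rtsp://<camera-uri> latency=1500 drop-on-latency=false \
  ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
```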
  1. Switched the latency to 1500 and still got similar artifacts. I forgot to use the new nvstreammux for the last test, but after using it, the stream blurriness reduced; however, the artifacts may have become a bit more apparent.
  2. The original source fps is 3, checked with ffprobe. Both the RTSP source and the mp4 gave fps=3 with ffprobe.
  3. Switched to drop-on-latency=false; it seems better for a single RTSP source and a single output RTSP recording. However, when switching to 2 input sources and 2 output RTSP sinks, the outputs had a lot more artifacts.
    I tried saving a file sink with both a single RTSP source and 2 sources, however the output files couldn’t be read for some reason. Here’s the ffprobe log: [mov,mp4,m4a,3gp,3g2,mj2 @ 0x61cbd6abd340] moov atom not found out1.mp4: Invalid data found when processing input
  1. if you are using old nvstreammux, Since the fps of two camera are 3, please set batched-push-timeout to 333333, meaning 1000000/max_fps= 1000000/3. if the fps is 25, please set batched-push-timeout to 40000 (not 400000).
  2. Seems using two rtsp sources and output rtsp sinks, the artifacts issue persists. Please run ‘nvidia-smi dmon -o D >1.log’ first, then use the following cmd to run again with the same source and sink configurations, then provide the log 1.log, 2.log. you can use zip tool to compress the file.
export GST_DEBUG=3,rtpjitterbuffer:6,h264parse:6,v4l2videodec:6,videodecoder:6,nvstreammux:6 && deepstream-app -c xxx.txt >2.log 2>2.log
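The batched-push-timeout arithmetic above (1000000 / max_fps in microseconds) goes in the streammux section; a sketch for the two 3 fps test sources used in this thread:

```ini
[streammux]
batch-size=2
# microseconds to wait before pushing an incomplete batch:
# 1000000 / max_fps -> 1000000 / 3 ≈ 333333 for the 3 fps sources
# (use 40000 instead for 25 fps cameras)
batched-push-timeout=333333
```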

Ran with the recommended settings and 2 RTSP sources and sinks; here are the logs with the updated batched-push-timeout:
logs.zip (6.9 MB)

[NOTE : this was with the new_nvstreammux enabled]

I checked the logs. Resource utilization is not too high; packet receiving and decoding are fine.

  1. Please run again with the same source and sink configurations, then use the following cmds to record two H.264 files. Please use localhost, not 127.0.0.1.
gst-launch-1.0 rtspsrc location=rtsp://localhost:8550/ds-test ! rtph264depay ! h264parse ! 'video/x-h264,stream-format=byte-stream' ! filesink location=test1.h264
gst-launch-1.0 rtspsrc location=rtsp://localhost:8552/ds-test ! rtph264depay ! h264parse ! 'video/x-h264,stream-format=byte-stream' ! filesink location=test2.h264
  1. If using the same source configurations with a filesink, could you share the two files recorded by the filesink?
  2. About “using ffmpeg to generate an HLS format”: how did you use ffmpeg to generate it?

Hi, only the first stream managed to generate some output; the 2nd stream ended up giving this error: [this was without new_nvstreammux enabled, and the stream had been running for over 10 minutes]

gst-launch-1.0 rtspsrc location=rtsp://localhost:8553/ds-test ! rtph264depay ! h264parse ! 'video/x-h264,stream-format=byte-stream' ! filesink location=test2.h264
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Pipeline is PREROLLED ...
Prerolled, waiting for progress to finish...
Progress: (connect) Connecting to rtsp://localhost:8553/ds-test
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Could not open resource for reading and writing.
Additional debug info:
../gst/rtsp/gstrtspsrc.c(8130): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Failed to connect. (Generic error)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...

[Edit] When I restarted the stream with new_nvstreammux enabled I could export the videos, but for stream1 I got this error:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Pipeline is PREROLLED ...
Prerolled, waiting for progress to finish...
Progress: (connect) Connecting to rtsp://localhost:8550/ds-test
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Redistribute latency...
Progress: (request) Sending PLAY request
Redistribute latency...
Progress: (request) Sent PLAY request
Redistribute latency...
Redistribute latency...
Redistribute latency...
Redistribute latency...
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc1: Internal data stream error.
Additional debug info:
../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc1:
streaming stopped, reason not-linked (-1)
Execution ended after 0:00:27.384067521
Setting pipeline to NULL ...
Freeing pipeline ...

I cannot view the test1.h264 file, even after trying to convert it to mp4 through ffmpeg. I recorded 1 minute of the stream and then stopped recording with Ctrl+C. Let me know if I can share the file in private, because I cannot disclose the contents of the files publicly.
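For reference, a raw .h264 elementary stream carries no container timestamps, so players and converters usually need the framerate supplied explicitly. A minimal sketch of such a remux command, built as a Python argument list in the same style as the HLS script below (the file names and the 25 fps value are assumptions from this thread):

```python
# Sketch: remux a raw H.264 elementary stream into MP4 without re-encoding.
# A bare .h264 file has no timestamps, so the input framerate must be given
# explicitly via "-framerate". Paths and fps here are assumptions.
def remux_cmd(h264_path, mp4_path, fps=25):
    return [
        "ffmpeg",
        "-framerate", str(fps),  # supply the timing the raw stream lacks
        "-i", h264_path,
        "-c", "copy",            # rewrap only, no re-encode
        mp4_path,
    ]
```

The resulting list can be passed to subprocess.run in the same way as the HLS command below.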

  1. By filesink, do you mean mp4 format via the DeepStream file sink, or GStreamer again?
  2. I am using ffmpeg through a Python script to convert the RTSP output from DeepStream into HLS segments and stream them on the web via localhost for now. These are the commands:
cmd = [
    "ffmpeg",
    "-loglevel", "info",
    "-fflags", "nobuffer",
    "-flags", "low_delay",
    "-i", url,
    "-c:v", "copy",
    "-c:a", "aac",
    "-f", "hls",
    "-hls_time", SEGMENT_TIME,
    "-hls_list_size", LIST_SIZE,
    "-hls_flags", "delete_segments+append_list+split_by_time+program_date_time+independent_segments",
    "-hls_allow_cache", "0",
    "-start_at_zero",
    playlist_path
]

with the segment time and list size set to 4 and 5 respectively, for 2–4 stream outputs from DeepStream.