RTSP vs video performance mismatch

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) Jetson Orin AGX 64 GB
• DeepStream Version 6.2
• JetPack Version (valid for Jetson only) 5.1
• TensorRT Version
• Issue Type( questions, new requirements, bugs) Using the same setup (models etc.) I get different performance results (maximum number of sources under a certain FPS threshold) when using RTSP sources vs video file sources.

My pipeline is: decoder - streammux - preprocess - pgie - tracker - sgie.
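For reference, that element order can be sketched as a gst-launch line. This is an illustration only, not the attached configs: the file path, model config names, and muxer parameters are placeholders; the real runs use deepstream-app with the configs attached below.

```
# Sketch of the pipeline topology (file-source variant, placeholder paths)
gst-launch-1.0 \
  uridecodebin uri=file:///path/to/video.mp4 ! m.sink_0 \
  nvstreammux name=m batch-size=8 width=1920 height=1080 ! \
  nvdspreprocess config-file=preprocess.txt ! \
  nvinfer config-file-path=pgie.txt ! \
  nvtracker ll-lib-file=libnvds_nvmultiobjecttracker.so ! \
  nvinfer config-file-path=sgie.txt ! \
  fakesink
```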
When I run 8 video file sources (25 FPS, 1080p), I get > 25 FPS processing speed.

Next, I run 5 RTSP streams (25 FPS, 1080p, 3072 kbps bitrate, H.265 encoding, I-frame interval 50); DeepStream can process the 5 streams at 25 FPS.

However, simply enabling a 6th RTSP stream causes a huge performance drop.

So there are 2 things I cannot understand:

  1. Why can’t I get the same FPS with RTSP as with video files? Is there a higher load on the system when decoding RTSP? The models shouldn’t be the issue here because, as I said, 8 video files run just fine at 25 FPS.
  2. Why is there a massive performance drop going from 5 to 6 RTSP streams?

What I was able to find out:

  1. Before these changes I could not even get 5 RTSP streams running at 25 FPS; setting select-rtp-protocol=4 under all [source] groups and lowering batched-push-timeout under [streammux] from 40000 to 30000 helped.
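For anyone hitting the same wall, those two tweaks look roughly like this in a deepstream-app config. The URI and batch/resolution values are placeholders for my setup, not universal recommendations; per the deepstream-app reference, select-rtp-protocol=4 selects TCP-only RTP transport:

```
[source0]
enable=1
type=4                           # 4 = RTSP source in deepstream-app
uri=rtsp://camera-host/stream    # placeholder address
# Force RTP over TCP; avoids UDP packet loss on congested links
select-rtp-protocol=4

[streammux]
batch-size=6
width=1920
height=1080
# Lowered from 40000 so partially filled batches are pushed sooner
batched-push-timeout=30000
```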

Attaching config.txt for RTSP and video setup.
config_rtsp.txt (2.4 KB)
config_video.txt (1.1 KB)

I am really at a dead end here. I have tried many things (again, the setup is exactly the same; the only difference is the source type) and would appreciate any ideas! Perhaps I am hitting some network bandwidth limitation? How could I check/verify this?
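As a first sanity check on the bandwidth theory, the aggregate ingest rate is easy to estimate. A back-of-the-envelope sketch; the helper name and the ~20% RTP/TCP/IP framing overhead factor are my assumptions:

```python
def aggregate_mbps(n_streams, bitrate_kbps=3072, overhead=1.2):
    """Video payload plus an assumed ~20% RTP/TCP/IP framing overhead, in Mbps."""
    return n_streams * bitrate_kbps * overhead / 1000.0

for n in (5, 6, 8):
    print(f"{n} streams: ~{aggregate_mbps(n):.1f} Mbps")
```

Even 8 such streams come out under 30 Mbps, far below a 1 GbE link, so raw throughput alone is unlikely to be the bottleneck; jitter or packet loss would be the more plausible network-side culprits.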

Live streams may be affected by Ethernet transfer efficiency. A local file can be processed as fast as the hardware allows.

You may enable GPU and CPU performance monitoring while running both cases.
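On Jetson the usual tool for this is `sudo tegrastats`, which reports CPU and GR3D (GPU) load and shows the NVDEC (hardware decoder) clock when it is active. A portable sketch for sampling overall CPU load from `/proc/stat` (the helper name is mine):

```shell
# cpu_busy_pct: percentage of non-idle CPU time over a 1-second window.
# /proc/stat "cpu" line fields: user nice system idle iowait irq softirq steal
cpu_busy_pct() {
  s1=$(awk '/^cpu /{print $2+$3+$4+$7+$8+$9, $5+$6}' /proc/stat)
  sleep 1
  s2=$(awk '/^cpu /{print $2+$3+$4+$7+$8+$9, $5+$6}' /proc/stat)
  set -- $s1 $s2
  busy=$(( $3 - $1 )); idle=$(( $4 - $2 ))
  echo $(( 100 * busy / (busy + idle) ))
}
cpu_busy_pct
```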


Perhaps the iPerf tool can help you gather more network statistics: https://iperf.fr/

Thank you, will try iPerf and get back to you.