• Hardware Platform (Jetson / GPU) GPU
• DeepStream Version 7.0 (docker image: nvcr.io/nvidia/deepstream:7.0-triton-multiarch )
• NVIDIA GPU Driver Version (valid for GPU only) 535.171.04
Hi,
I am currently working with the deepstream_parallel_inference_app and trying to process multiple RTSP sources that have different frame rates (FPS).
I’ve tested these two configurations:
- bodypose_yolo_lpr/source4_1080p_dec_parallel_infer.yml
- vehicle_lpr_analytic/source4_1080p_dec_parallel_infer.yml
In both cases, I enabled only sink0 (EglSink) and modified the streammux properties to use a new streammux configuration:
streammux:
batch-size: 4
## Set muxer output width and height
width: 1920
height: 1080
config-file: config_new_streammux.txt
For the sources, I used the following source.csv file:
enable,type,uri,num-sources,gpu-id,cudadec-memtype
1,4,rtsp://localhost:8554/stream_0,1,0,0
1,4,rtsp://localhost:8554/stream_1,1,0,0
1,4,rtsp://localhost:8554/stream_2,1,0,0
1,4,rtsp://localhost:8554/stream_3,1,0,0
- stream_0 and stream_1 are running at 30 FPS
- stream_2 and stream_3 are running at 20 FPS
Here’s the content of my config_new_streammux.txt file:
[property]
adaptive-batching=1
## Set to maximum fps
overall-min-fps-n=30
overall-min-fps-d=1
## Set to ceil(maximum fps/minimum fps)
max-same-source-frames=2
[source-config-0]
## Set to ceil(current fps/minimum fps)
max-num-frames-per-batch=2
[source-config-1]
## Set to ceil(current fps/minimum fps)
max-num-frames-per-batch=2
[source-config-2]
## Set to ceil(current fps/minimum fps)
max-num-frames-per-batch=1
[source-config-3]
## Set to ceil(current fps/minimum fps)
max-num-frames-per-batch=1
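For reference, this is how I derived the values above from the ceil(...) formulas in the comments, given my stream frame rates (a small sketch; the FPS mapping is just my four streams):

```python
import math

# Stream index -> FPS, matching source.csv order above.
source_fps = {0: 30, 1: 30, 2: 20, 3: 20}
min_fps = min(source_fps.values())  # 20
max_fps = max(source_fps.values())  # 30

# overall-min-fps-n/d: set to the maximum source FPS.
overall_min_fps = (max_fps, 1)

# max-same-source-frames = ceil(maximum fps / minimum fps)
max_same_source_frames = math.ceil(max_fps / min_fps)

# Per-source max-num-frames-per-batch = ceil(current fps / minimum fps)
per_source = {i: math.ceil(fps / min_fps) for i, fps in source_fps.items()}

print(max_same_source_frames)  # 2
print(per_source)              # {0: 2, 1: 2, 2: 1, 3: 1}
```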
The problem I’m facing is that, after processing a few frames, the pipeline freezes—nothing gets processed, and the GPU usage drops to 0%.
Questions:
- Is there a known issue with the deepstream_parallel_inference_app when handling multiple RTSP sources with different frame rates?
- Could the streammux configuration be causing the freeze?
- Any suggestions or potential fixes for handling this scenario?
Thank you for your help!