Batch processing with Gst-nvstreammux New (Beta)

• Hardware Platform: GPU
• DeepStream Version: 6.0.1
• TensorRT Version: 8.2.4.2-1
• NVIDIA GPU Driver Version: 470.103.01
• Issue Type: questions and errors

I am using the DeepStream SDK with Python to handle a large number of RTSP streams (>100), with the following pipeline:

rtspsrc ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvideoconvert ! capsfilter ! queue ! nvstreammux ! nvvideoconvert ! nvinfer ! nvtracker ! fakesink.
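
A simplified Python sketch of how a per-source branch like this can be attached to nvstreammux through a request sink pad (a sketch only, not the exact code from this setup; the URI and pad index are placeholders):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.Pipeline.new("rtsp-batching")

# New nvstreammux; its batching behaviour comes from the config file shown below.
streammux = Gst.ElementFactory.make("nvstreammux", "mux")
pipeline.add(streammux)

def add_source(index, uri):
    # One per-stream branch, matching the pipeline above up to the mux.
    # gst_parse handles the delayed linking of rtspsrc's dynamic pad.
    desc = ("rtspsrc location={u} ! rtph264depay ! h264parse ! nvv4l2decoder ! "
            "nvvideoconvert ! video/x-raw(memory:NVMM) ! queue").format(u=uri)
    src_bin = Gst.parse_bin_from_description(desc, True)  # ghost-pads the unlinked queue src pad
    pipeline.add(src_bin)
    mux_pad = streammux.get_request_pad("sink_%u" % index)  # one request pad per source
    src_bin.get_static_pad("src").link(mux_pad)

add_source(0, "rtsp://example.com/stream0")  # placeholder URI; repeated for every stream
```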

I use Gst-nvstreammux New (Beta) with adaptive batching, with the following config:

[property]
algorithm-type=1
adaptive-batching=1
max-fps-control=1
overall-max-fps-n=6
overall-max-fps-d=1
overall-min-fps-n=4
overall-min-fps-d=1
max-same-source-frames=1
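
For completeness, a sketch of how a config like this is typically wired up from Python (this wiring is an assumption, not shown in the post above): the new (beta) mux is selected with the USE_NEW_NVSTREAMMUX environment variable, and the [property] group is passed in through the mux's config-file-path property; the file name below is a placeholder.

```python
import os
# Selects the new (beta) nvstreammux; usually exported in the shell before
# launching the application, set here before GStreamer loads the plugin.
os.environ["USE_NEW_NVSTREAMMUX"] = "yes"

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
streammux = Gst.ElementFactory.make("nvstreammux", "mux")
streammux.set_property("config-file-path", "mux_config.txt")  # file containing the [property] group above
```
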
1. My first question concerns the following errors in the logs (despite the errors, the pipeline works):
[Error while parsing streammux config file: Key file does not have key “enable-source-rate-control” in group “property”]
[Error while parsing streammux config file: Key file does not have key “batch-size” in group “property”]

There is no information in the docs about the enable-source-rate-control property.
And why does it require batch-size when I am using adaptive batching?

2. nvinfer also has a batch-size property, and the TensorRT engine is built for that batch size.
    Since RTSP streams are not stable, there are cases when streammux does not collect a full batch. How does nvinfer work in that situation? For example, batch-size is 20, but only 15 frames were collected. (I can't wait too long to collect frames from all sources, because that would increase the latency.)
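
One way to see how full each batch actually is (a sketch assuming the standard pyds bindings; "pgie" is a placeholder name for the nvinfer element) is a buffer probe on the nvinfer sink pad that reads num_frames_in_batch from the batch meta:

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst
import pyds

def batch_fill_probe(pad, info):
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK
    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    if batch_meta:
        # Can be smaller than nvinfer's configured batch-size when some
        # sources did not deliver a frame in time.
        print("frames in this batch:", batch_meta.num_frames_in_batch)
    return Gst.PadProbeReturn.OK

# pgie = pipeline.get_by_name("pgie")  # placeholder name for the nvinfer element
# pgie.get_static_pad("sink").add_probe(Gst.PadProbeType.BUFFER, batch_fill_probe)
```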

1. enable-source-rate-control is an internal strategy; the default is false. Adaptive batching has higher priority: if both are set, the batch size will follow adaptive batching. Please refer to the Gst-nvstreammux New — DeepStream 6.1.1 Release documentation.

2. It depends on the streammux configuration, e.g. sync-inputs, max-latency, and batched-push-timeout. Please refer to the Gst-nvstreammux New — DeepStream 6.1.1 Release documentation.
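
For illustration, a minimal sketch of setting those timing-related properties on the mux from Python (the values are placeholders, not recommendations; sync-inputs and max-latency are properties of the new nvstreammux, while batched-push-timeout belongs to the legacy mux):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
streammux = Gst.ElementFactory.make("nvstreammux", "mux")
streammux.set_property("sync-inputs", True)               # synchronize buffers across sources
streammux.set_property("max-latency", 100 * 1000 * 1000)  # extra wait for late sources, in nanoseconds
```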

There has been no update from you for a while, so we assume this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.
Thanks
