Batch size is smaller than number of streams in DS pipeline

• Hardware Platform (Jetson / GPU):jetson
• DeepStream Version:5.0 ga
• JetPack Version (valid for Jetson only):4.4
• TensorRT Version:7.1

Hi guys,
Suppose I have 20 RTSP streams and I can’t pass all of them (batch-size=20) into nvinfer at the same time. I want to pass only 5 of the 20 streams into nvinfer at a time. Is that possible? If I set nvinfer’s batch-size to 5, will it pass a batch of 5 streams into the model each time? I mean, will it pass streams 1–5, then 6–10, then 11–15, then 16–20 into the model?

No, DeepStream does not support that. There are several “batch-size” parameters in DeepStream, each with its own meaning, but none of them means the number of streams, even though it is usually best to set them equal to the number of streams.
The streams in the same pipeline are handled at the same time. If you don’t want them handled at the same time, you need to handle different streams in different pipelines.

@Fiona.Chen, I don’t want to use a second pipeline, because then the model would be loaded into memory twice.

If I set nvstreammux’s batch-size=20 and nvinfer’s batch-size=5, can nvinfer still process all of the streams with batch-size=5? If not, which of the 20 streams will the nvinfer with batch-size=5 process?

The nvstreammux batch-size and nvinfer batch-size can be different, but the performance will vary with different batch-size settings.
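For reference, mismatched batch sizes can be expressed in a deepstream-app style configuration. This is only a sketch of this thread’s example, assuming the deepstream-app config format (group and key names as in the reference application; `config_infer_primary.txt` is a placeholder for your nvinfer config file):

```ini
[streammux]
## one batched buffer carries up to one frame from each of the 20 sources
batch-size=20

[primary-gie]
enable=1
## overrides the batch-size in the nvinfer config file;
## the TensorRT engine is built for a max batch of 5
batch-size=5
config-file=config_infer_primary.txt
```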

If you set nvstreammux’s batch-size=20 and nvinfer’s batch-size=5, then 20 frames from the 20 streams will be handled by nvinfer: nvinfer needs to run inference 20/5 = 4 times to finish those 20 frames, and only then will the next 20 frames be handled.
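The arithmetic above generalizes: per muxed batch, nvinfer runs ceil(mux batch-size / infer batch-size) inference passes. A minimal sketch in plain Python (`inference_passes` is a hypothetical helper for illustration, not a DeepStream API):

```python
import math

def inference_passes(mux_batch_size: int, infer_batch_size: int) -> int:
    """Number of inference runs nvinfer needs to consume one
    batched buffer produced by nvstreammux."""
    return math.ceil(mux_batch_size / infer_batch_size)

# The example from this thread: 20 muxed frames, nvinfer batch-size=5
print(inference_passes(20, 5))  # 4
# A non-divisible case: the last pass runs with a partial batch
print(inference_passes(20, 6))  # 4 (batches of 6, 6, 6, 2)
```

When the mux batch is not a multiple of the infer batch, the last pass simply runs with a partial batch.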