Batch size adjustment

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements. Include the module name, i.e. which plugin or which sample application, and the function description.)

Hi, I am running the License Plate Recognition DeepStream SDK app. I am running 1 stream with the following batch size configuration:
Batch size for nvstreammux = 1
Batch size for pgie = 1
Batch size for sgie = 16
Batch size for sgie2 = 16
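For reference, in a deepstream-app style configuration these values map to the [streammux] group and to the batch-size key in each nvinfer config file. A minimal sketch, assuming the deepstream-app config conventions (the LPR sample's actual file names and group layout may differ):

```
# Illustrative only; adapt to your own config files.

[streammux]
batch-size=1        # number of frames muxed into one batch

# pgie (detector) nvinfer config file, [property] group:
#   batch-size=1

# sgie / sgie2 (LPD / LPR) nvinfer config files, [property] group:
#   batch-size=16
```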

If I would like to increase my RTSP sources from 1 to 10, is it optimal for me to just multiply all the batch sizes by 10? Please advise.

The nvstreammux batch size should be the same as your source count. The model max batch sizes depend on your device and use case. If there are not that many cars in your scenes, you can keep the nvinfer batch sizes at your original settings.
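Following that advice, only the nvstreammux batch size has to track the source count; the secondary-inference batch sizes can stay put if the per-batch object count remains modest. A hedged sketch of the 10-stream settings (illustrative, not the sample's exact files):

```
# Illustrative only; adapt to your own config files.

[streammux]
batch-size=10       # must equal the number of input sources

# pgie [property]: batch-size=10 lets one inference call cover a full
# muxed batch, but it must not exceed the TensorRT engine's max batch
# size; rebuild the engine if you raise it.

# sgie / sgie2 [property]: batch-size=16 can remain unchanged if the
# number of detected cars/plates per batch stays low.
```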

Besides the batch sizes, do any other configurations need to be taken into account for multi-stream, e.g. configuration for other plugins such as the decoder?

https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_FAQ.html#what-is-the-difference-between-batch-size-of-nvstreammux-and-nvinfer-what-are-the-recommended-values-for-nvstreammux-batch-size
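On the source side, a deepstream-app style config typically adds one [sourceN] group per stream. A minimal sketch, assuming the deepstream-app source-group conventions (URIs are placeholders):

```
# Illustrative only; one group per RTSP stream.

[source0]
enable=1
type=4              # 4 = RTSP in deepstream-app configs
uri=rtsp://<camera-0-address>

[source1]
enable=1
type=4
uri=rtsp://<camera-1-address>

# ... continue up to [source9] for 10 streams
```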

DeepStream SDK FAQ - Intelligent Video Analytics / DeepStream SDK - NVIDIA Developer Forums

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.
