Gst-nvinfer plugin’s Gst properties: What is "batch-size"?

• Jetson Nano
• DeepStream Version: 5.0.1
• JetPack Version: 4.4
• TensorRT 7.0.1
• Issue Type: Question

Hi,
I have a couple of questions about the “batch-size” configuration described in this link (Gst-nvinfer — DeepStream 6.0.1 Release documentation).

  1. Are there any indicators to consider when choosing this batch-size? I’ve changed the value but couldn’t find any difference in the results. (So far I’m using 1 as my default.)

  2. Does increasing the batch-size usually improve performance?

  3. Also, does this mean, for example, that if there are 4 frames with 3 objects each and I run with batch-size=1, it will only infer 1 frame (with all 3 objects) per batch?

Batch is a very common concept in machine learning; see Machine learning inference during deployment - Cloud Adoption Framework | Microsoft Docs.

  1. The gst-nvinfer batch-size parameter describes the inference model run by gst-nvinfer: it is the max batch size of your model. The model’s max batch size is decided when the model is generated (it has nothing to do with DeepStream; you need to consult the person who trained and generated the model).
  2. It depends on the model and the data provided from upstream. If the data from nvstreammux only has batch size 1, there is no use in increasing the nvinfer batch size. Frequently Asked Questions — DeepStream 6.0.1 Release documentation
  3. If you batch the 4 frames with nvstreammux into one batch, and your inference model’s max batch size is no less than 4, the pipeline can infer the 4 frames in a single batch.
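The relationship in points 2 and 3 can be sketched with a toy model of the batching arithmetic (the function here is hypothetical, for illustration only — it is not DeepStream API):

```python
from math import ceil

def inference_calls(frames_from_mux, nvinfer_batch_size):
    """Toy model: nvstreammux delivers `frames_from_mux` frames in one
    batched buffer; nvinfer processes them in chunks of at most
    `nvinfer_batch_size`, so fewer calls means better batching."""
    return ceil(frames_from_mux / nvinfer_batch_size)

# nvstreammux batches 4 frames, but nvinfer batch-size=1:
# each frame is inferred separately -> 4 inference calls.
print(inference_calls(4, 1))  # 4

# nvinfer batch-size=4 (and the model's max batch size >= 4):
# all 4 frames go through in one call.
print(inference_calls(4, 4))  # 1
```

This also shows why changing nvinfer’s batch-size alone may produce no visible difference: if the upstream nvstreammux batch is only 1 frame, every setting yields one call per frame anyway.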

Hi Fiona,

Thank you for replying and helping me out.

I have another question for 2.
Does that mean I should set the same batch size for nvinfer and nvstreammux…?

For common deepstream cases, if there is only PGIE, the answer is yes.
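For a deepstream-app style configuration, matching the two batch sizes in the PGIE-only case might look like the fragment below (the section and key names follow the deepstream-app config-file format; the values and the config-file name are illustrative):

```
[streammux]
batch-size=4

[primary-gie]
enable=1
batch-size=4
config-file=config_infer_primary.txt
```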

Hi Fiona,

Thank you for replying.
How should I set the batch size for nvinfer if there is a secondary GIE?

For example, with 4 sources (frames?) as input, the batch size for nvstreammux would be okay to set as 4.

With a primary AND a secondary GIE, would the batch size for nvinfer be 8…?

That depends on your SGIE model’s max batch size and the pool size of the PGIE. Please refer to the source code of gst-nvinfer.
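To make the SGIE case concrete: a secondary GIE operating on objects batches detected-object crops rather than frames, so its effective batch depends on how many objects the PGIE produces per frame, not on the source count alone. A toy sketch (the helper is hypothetical, not DeepStream API):

```python
from math import ceil

def sgie_inference_calls(num_frames, objects_per_frame, sgie_batch_size):
    """Toy model: the SGIE crops every object the PGIE detected and
    infers the crops in chunks of at most `sgie_batch_size`."""
    total_objects = num_frames * objects_per_frame
    return ceil(total_objects / sgie_batch_size)

# 4 frames with 3 detected objects each -> 12 object crops;
# with SGIE batch-size=8 this takes 2 inference calls.
print(sgie_inference_calls(4, 3, 8))  # 2
```

Under this model there is no fixed multiplier like “sources x 2”; a reasonable SGIE batch size tracks the expected object count per batched buffer, bounded by the SGIE model’s max batch size.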

Hi Fiona,

Got it,
Thank you for replying and helping me out.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.