Manage the batch size

• Hardware Platform: Jetson Orin AGX Development Kit
• DeepStream Version: 6.3
• JetPack Version: 5.1.2

In my DeepStream-based people-counting system, which includes detection, tracking, and classification stages, I encountered a warning:

“Warning: gst-core-error-quark: A lot of buffers are being dropped. (13): gstbasesink.c(3003): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstNv3dSink:nv3dsink-0: There may be a timestamping problem, or this computer is too slow.”

To address this issue, I tried increasing the batch size in the infer-config used with nvstreamdemux. However, despite the larger batch size, I saw no improvement in execution time or in the system's results, and I continued to encounter another warning:

“WARNING: Overriding infer-config batch-size 5 with number of sources 2.”

My question is whether increasing the batch-size value will enable the system to process multiple frames simultaneously, or whether DeepStream requires the batch size to always match the number of source streams.
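For reference, a minimal two-source DeepStream pipeline of the shape described above might look like the following sketch. The file paths, resolution, and nvinfer config filename are illustrative assumptions, not taken from the post; `sync=false` on the sink disables clock synchronization in the base sink, which suppresses the "buffers are being dropped" warning at the cost of ignoring presentation timestamps:

```shell
# Illustrative sketch only -- paths, resolution, and config filename are assumptions.
gst-launch-1.0 \
  uridecodebin uri=file:///path/to/video1.mp4 ! mux.sink_0 \
  uridecodebin uri=file:///path/to/video2.mp4 ! mux.sink_1 \
  nvstreammux name=mux batch-size=2 width=1920 height=1080 \
              batched-push-timeout=40000 ! \
  nvinfer config-file-path=config_infer_primary.txt ! \
  nvvideoconvert ! \
  nv3dsink sync=false
```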

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

If the GPU load is high due to inference (nvinfer), adjusting the batch size of nvstreammux may have little effect. You can try adjusting the interval parameter in nvinfer and see if that helps.
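As a sketch, the relevant knobs live in the `[property]` section of the nvinfer configuration file (the filename and values below are illustrative assumptions). For a primary-mode GIE, nvinfer overrides batch-size with the number of sources, which is what produces the "Overriding infer-config batch-size 5 with number of sources 2" warning; `interval` skips whole batches between inference runs, trading detection freshness for lower GPU load:

```
# Illustrative fragment of an nvinfer config file (e.g. config_infer_primary.txt).
[property]
batch-size=2   # for primary inference this is overridden to match the source count
interval=1     # number of consecutive batches to skip between inference runs
```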

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.