Deepstream Batch Size > 1 Downstream Issue

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) GPU
• DeepStream Version 7.1
• JetPack Version (valid for Jetson only)
• TensorRT Version 10.5.0.18
• NVIDIA GPU Driver Version (valid for GPU only) 560.35.03
• Issue Type (questions, new requirements, bugs) Bug
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements. Include the module name, i.e. for which plugin or which sample application, and the function description.)

Pipeline:
nvurisrcbin -> nvstreammux -> nvinfer -> nvtracker -> nvdspreprocess

Description:
I’m encountering a batch-related issue with my pipeline. I’ve implemented a custom transform function in the nvdspreprocess plugin. However, inside the top-level function called by the preprocessing plugin, the NvDsPreProcessBatch *batch argument shows a mismatch between two counts:

  • batch->units.size() == 1
  • batch->batch_meta->num_frames_in_batch == 2 (as expected, with batch_size == 2 set in the upstream nvstreammux).

Interestingly, the issue resolves when I set batch_size == 1 in nvstreammux, but this underutilizes resources.

Has anyone encountered this issue, or can provide guidance on why the batch->units.size() does not align with num_frames_in_batch?

How do you get batch->units.size()? Could you add some logs in our open source gst_nvdspreprocess_on_frame() in deepstream/sources/gst-plugins/gst-nvdspreprocess/gstnvdspreprocess.cpp to debug that yourself?
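For example, something along these lines could be added after the loop that pushes the processing units for the current buffer (a rough sketch only; the local variable names inside gst_nvdspreprocess_on_frame() are assumptions and should be matched to the actual DeepStream 7.1 sources):

```cpp
/* Hypothetical debug line for gst_nvdspreprocess_on_frame() in
 * gstnvdspreprocess.cpp. "batch" is assumed to be the local
 * NvDsPreProcessBatch* being filled for the current buffer; adjust the
 * name to match the actual code. */
GST_DEBUG_OBJECT (nvdspreprocess,
    "units pushed = %" G_GSIZE_FORMAT ", num_frames_in_batch = %u",
    (gsize) batch->units.size (), batch->batch_meta->num_frames_in_batch);
```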

I am printing batch->units.size() inside my custom transform function in the preprocessing plugin. I will investigate further today and post an update.
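Roughly what I am doing (a minimal sketch; the entry-point name and signature follow the sample nvdspreprocess_lib, and the actual buffer acquisition and tensor filling are omitted):

```cpp
/* Minimal sketch of the custom tensor preparation hook. The entry-point
 * name and signature follow the sample nvdspreprocess_lib; the real
 * implementation still acquires the output buffer and fills the tensor. */
#include <cstdio>
#include "nvdspreprocess_interface.h"

extern "C" NvDsPreProcessStatus
CustomTensorPreparation (CustomCtx *ctx, NvDsPreProcessBatch *batch,
    NvDsPreProcessCustomBuf *&buf, CustomTensorParams &tensorParam,
    NvDsPreProcessAcquirer *acquirer)
{
  /* Compare the units handed to the custom lib with the frames that
   * nvstreammux actually batched. */
  printf ("units=%zu num_frames_in_batch=%u\n",
      batch->units.size (), batch->batch_meta->num_frames_in_batch);

  /* ... acquire buf via the acquirer and fill the tensor (omitted) ... */

  return NVDSPREPROCESS_SUCCESS;
}
```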

Fixed by changing the src-ids listed in the only group of the preprocess config from the implicit value (-1, meaning all sources) to an explicit list of all source IDs.
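For anyone hitting the same thing, the change was only in the group section of the preprocess config. A hypothetical before/after excerpt, assuming a two-source pipeline:

```
# Before: implicit "all sources"
[group-0]
src-ids=-1

# After: explicit list of source IDs (two sources in my case)
[group-0]
src-ids=0;1
```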

Glad to hear that. If you have any new questions, you can file a new topic.
