Batch inference at secondary inference

How can I batch images for the sgie in DeepStream?
I can't find where the batch of images is formed. The secondary gie reports its input shape as

0 INPUT kFLOAT input:0 24x94x3 min: 1x24x94x3 opt: 10x24x94x3 Max: 10x24x94x3

I get this error at the secondary gie:

ERROR: [TRT]: Flatten/flatten/Reshape: reshaping failed for tensor: Transpose__50:0
ERROR: [TRT]: shapeMachine.cpp (154) - Shape Error in executeReshape: reshape dimension of -1 has no solution
ERROR: [TRT]: Instruction: RESHAPE_ZERO_IS_PLACEHOLDER{3 1 88 48} {10 -1}
ERROR: Failed to enqueue trt inference batch

In {3 1 88 48}, I think the 1 is the batch dimension.
Where should I look to debug this error?

Found out that we just need to create the network with a dynamic shape at the network's input; the secondary gie's batch size then changes according to the number of detections at the pgie's output. A dynamic-reshape TensorRT plugin built with IPluginV2DynamicExt is only needed if a custom plugin layer is required. In other words, DeepStream takes care of the dynamic reshaping internally.
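For reference, a dynamic-batch engine can be built offline with trtexec by supplying an optimization profile whose min/opt/max shapes match the binding printed above. This is only a sketch: the file names `sgie.onnx` and `sgie_dynamic.engine` are placeholders, and since the tensor name `input:0` contains a colon (trtexec's separator), it is wrapped in quotes.

```shell
# Sketch: build an engine with a dynamic batch dimension
# (min 1 / opt 10 / max 10, matching the binding in the question).
# File names are placeholders for illustration.
trtexec --onnx=sgie.onnx \
        --minShapes='input:0':1x24x94x3 \
        --optShapes='input:0':10x24x94x3 \
        --maxShapes='input:0':10x24x94x3 \
        --saveEngine=sgie_dynamic.engine
```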


Please refer to Explicit Full Dimension Network Support in https://docs.nvidia.com/metropolis/deepstream/plugin-manual/index.html#page/DeepStream%20Plugins%20Development%20Guide/deepstream_plugin_details.html#wwpID0E0YFB0HA
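With a full-dimension (dynamic-shape) model, the sgie's nvinfer config just needs to point at the model and set the maximum batch size. A minimal sketch, assuming the engine built with a max batch of 10 and a pgie with `gie-unique-id=1` (paths and ids here are placeholders):

```
[property]
# placeholder paths for illustration
onnx-file=sgie.onnx
model-engine-file=sgie_dynamic.engine
# must not exceed the optimization profile's max batch (10 here)
batch-size=10
# run as a secondary gie on the pgie's detections
process-mode=2
gie-unique-id=2
operate-on-gie-id=1
# use the explicit full-dimension network path
force-implicit-batch-dim=0
```

DeepStream then picks the actual batch size per frame from the number of detected objects, up to `batch-size`.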