Invalid argument: Input shapes are inconsistent on the batch dimension, for TRTEngineOp_0 (for secondary inference)

Please provide complete information as applicable to your setup.

• Hardware Platform: RTX 2080
• DeepStream Version: 5.0
• NVIDIA GPU Driver Version: 440.33.01
Hi,
I want to run my primary and secondary models with the following optimization parameters added to their configs:

optimization { execution_accelerators {
  gpu_execution_accelerator : [ {
    name : "tensorrt"
    parameters { key: "precision_mode" value: "FP16" }
    parameters { key: "max_workspace_size_bytes" value: "512000000" }
  }]
}}

Both models work fine without the accelerator section above, with the rest of the config and pipeline untouched. In both cases (with and without that section) the primary model has its batch size set to 2 and the secondary model has its batch size set to 16.
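For context, here is roughly what the secondary model's full config looks like with the accelerator section in place. This is a minimal sketch; the model name and platform shown are placeholders, not my actual values:

name: "secondary_model"          # placeholder
platform: "tensorflow_graphdef"  # placeholder
max_batch_size: 16
optimization { execution_accelerators {
  gpu_execution_accelerator : [ {
    name : "tensorrt"
    parameters { key: "precision_mode" value: "FP16" }
    parameters { key: "max_workspace_size_bytes" value: "512000000" }
  }]
}}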

The primary model runs fine with better throughput, but when the accelerator is set for the secondary model it gives this error:

Invalid argument: Input shapes are inconsistent on the batch dimension, for TRTEngineOp_0.