TensorRT 5.0.2 batch size problem: why does inference time increase with a bigger batch size?

Hi,

Batch size is the number of inputs processed in a single inference call.
For an input tensor (N, C, H, W), the batch size is the value of N.

Take images as an example:
Batch size equal to 1 → you run inference on one image per call.
Batch size equal to 2 → you run inference on two images per call.

Since the computational work is proportional to N, the execution time increases as N grows. However, a larger batch amortizes fixed overhead (kernel launches, memory transfers) and uses the GPU more efficiently, so the time grows sublinearly. In general, the execution time follows:

T(N=1) < T(N=k) < k*T(N=1)
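You can observe this scaling with a minimal timing sketch. The snippet below is not TensorRT; it uses a NumPy matrix multiply as a hypothetical stand-in for a network layer, purely to illustrate how batched execution time grows with N while staying below N times the single-input time:

```python
import time
import numpy as np

# Hypothetical stand-in for a network layer: one dense layer (matmul).
# Input dimension 4096 and output dimension 1000 are arbitrary choices.
rng = np.random.default_rng(0)
weights = np.asarray(rng.standard_normal((4096, 1000)), dtype=np.float32)

def infer(batch):
    # batch has shape (N, 4096); one call processes all N inputs at once
    return batch @ weights

def timed(n, repeats=10):
    batch = np.asarray(rng.standard_normal((n, 4096)), dtype=np.float32)
    infer(batch)  # warm-up
    start = time.perf_counter()
    for _ in range(repeats):
        infer(batch)
    return (time.perf_counter() - start) / repeats

t1 = timed(1)
t8 = timed(8)
print(f"T(N=1) = {t1 * 1e3:.3f} ms")
print(f"T(N=8) = {t8 * 1e3:.3f} ms")
print(f"8*T(N=1) = {8 * t1 * 1e3:.3f} ms")
```

On most hardware T(N=8) lands between T(N=1) and 8*T(N=1), matching the inequality above; the exact ratio depends on the device and the model.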

Thanks.