Keras -> ONNX -> TensorRT

Ah, sorry - I used trtexec --explicitBatch --onnx=model.onnx, which I believe creates a dummy optimization profile for batch size 1 by default when none is specified, just so the model can be parsed.

You can also reference this thread for how to create profiles using trtexec if interested: TensorRT 7 ONNX models with variable batch size - #6 by NVES_R
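If you want to define the profiles directly from trtexec instead of relying on the dummy one, you can pass explicit shape ranges. A sketch below - the input tensor name "input" and the 3x224x224 dimensions are placeholders, so substitute whatever your ONNX model's actual input name and shape are:

```shell
# Build an engine with a dynamic-batch profile: min batch 1, optimal 8, max 32.
# Replace "input" and the HxWxC dims with your model's real input name/shape.
trtexec --explicitBatch --onnx=model.onnx \
  --minShapes=input:1x3x224x224 \
  --optShapes=input:8x3x224x224 \
  --maxShapes=input:32x3x224x224 \
  --saveEngine=model.engine
```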


To do this with the Python API, you can reference this script: https://github.com/rmccorm4/tensorrt-utils/blob/master/classification/imagenet/onnx_to_tensorrt.py. It creates some default optimization profiles for various batch sizes.
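The core of what that script does can be sketched with the TensorRT Python API directly (TensorRT 7 style). This is a minimal sketch, not the script itself - the input name "input" and the shapes are assumptions you'd replace with your model's:

```python
# Minimal sketch: parse an ONNX model and attach one dynamic-batch
# optimization profile. Assumes TensorRT 7.x and an input tensor
# named "input" with shape -1x3x224x224 (placeholders - adjust).
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)

# Explicit-batch network is required for ONNX models in TRT 7.
flags = 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
network = builder.create_network(flags)

parser = trt.OnnxParser(network, logger)
with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("Failed to parse ONNX model")

config = builder.create_builder_config()

# One profile covering batch sizes 1..32, tuned for batch 8.
profile = builder.create_optimization_profile()
profile.set_shape("input",
                  min=(1, 3, 224, 224),
                  opt=(8, 3, 224, 224),
                  max=(32, 3, 224, 224))
config.add_optimization_profile(profile)

engine = builder.build_engine(network, config)
```

You can add multiple profiles (one add_optimization_profile call each) if you want the engine tuned for several distinct batch-size ranges, which is roughly what the linked script's defaults do.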

It should work with something like:

python3 onnx_to_tensorrt.py --explicit-batch --onnx=model.onnx 

You can tweak or reuse parts of the code for your needs; you definitely don't have to use it as is.