How to convert ONNX models to .plan?


I have successfully converted some models (InceptionV1, V3 and V4) to ONNX using tf2onnx.

How can I convert them to .plan? Where can I find an example?

I am using TensorRT 6.0.1 and onnx 1.7.


You can convert them with our trtexec binary directly:

/usr/src/tensorrt/bin/trtexec --onnx=[model]


If I use “trtexec” to generate the .plan/.trt file, will I still be able to execute the file as is done in this GIT: NVIDIA-AI-IOT, or something similar?

I am doing a benchmark between the TX2 and Coral. So I need to specify the number of inferences, the inference time, etc.


You can use trtexec for benchmarking directly.
Please use --help to check the supported parameters.
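For example, a benchmarking run might look like the sketch below. The flag names come from trtexec --help in TensorRT 6; "model.onnx" is a placeholder for your converted Inception model, and the specific values are illustrative.

```shell
# --warmUp is in milliseconds before timing starts; --iterations sets the
# minimum number of timed inference runs; --avgRuns averages latency over
# that many runs per report; --fp16 enables half precision, which is fast
# on the TX2 GPU.
/usr/src/tensorrt/bin/trtexec --onnx=model.onnx \
  --batch=1 --warmUp=200 --iterations=100 --avgRuns=10 --fp16
```

trtexec then prints the measured latency and throughput, which you can compare directly against your Coral numbers.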

Another alternative is to serialize the TensorRT engine to a file via --saveEngine=[file/name].
Then you can feed it into the NVIDIA-AI-IOT sample with plan_filename=[file/name].
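If you prefer to drive the engine from your own benchmarking script instead, a saved engine can be deserialized with the TensorRT Python bindings that ship with JetPack. This is a minimal sketch; "model.plan" is a placeholder for whatever name you passed to --saveEngine.

```python
# Sketch: load a .plan file produced by trtexec --saveEngine.
# Assumes the TensorRT Python bindings are installed (as on JetPack);
# "model.plan" is a hypothetical file name.
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

with open("model.plan", "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())

# The engine is then used to create an execution context for inference:
# context = engine.create_execution_context()
```

Note that a .plan file is specific to the TensorRT version and GPU it was built on, so the engine must be generated on the TX2 itself.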
