How to convert ONNX models to .plan?

Hi

I have successfully converted some models (InceptionV1, V3 and V4) to ONNX using tf2onnx.

How can I convert them to .plan? Where can I find an example?

I am using TensorRT 6.0.1 and ONNX 1.7.

Hi,

You can convert it with our trtexec binary directly:

/usr/src/tensorrt/bin/trtexec --onnx=[model]
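
For example, assuming your exported file is named inception_v3.onnx (the file name is only an illustration), you can also enable FP16 to speed things up on the TX2:

/usr/src/tensorrt/bin/trtexec --onnx=inception_v3.onnx --fp16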

Thanks.

If I use “trtexec” to generate the .plan/.trt file, will I still be able to execute the file like it has been done in this GitHub repo: NVIDIA-AI-IOT, or something similar?

I am doing a benchmark between the TX2 and the Coral. So I need to specify the number of inferences, measure the inference time, etc.

Hi,

You can use trtexec for benchmarking directly.
Please use --help to check the supported parameters.
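
For example (a sketch only; flag names can change between TensorRT versions, so verify them with --help), something like this runs a fixed number of timed iterations:

/usr/src/tensorrt/bin/trtexec --onnx=inception_v3.onnx --iterations=100 --avgRuns=10 --fp16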

Another alternative is to serialize the TensorRT engine into a file via --saveEngine=[file/name].
Then you can feed it into the NVIDIA-AI-IOT sample with plan_filename=[file/name].
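
If it helps, here is a minimal Python sketch (not taken from the NVIDIA-AI-IOT repo itself) that deserializes a plan saved with --saveEngine and times a fixed number of inferences with the TensorRT Python API and pycuda. The file name inception_v3.plan, the single-input/single-output assumption, and the iteration count are illustrations only; adapt them to your model.

import time

import pycuda.autoinit          # creates and holds a CUDA context
import pycuda.driver as cuda
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# Load the serialized engine (.plan) produced by trtexec --saveEngine
# (the file name is an assumption; use whatever you passed to --saveEngine)
with open("inception_v3.plan", "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())

context = engine.create_execution_context()

# Allocate host and device buffers for every binding (inputs and outputs)
bindings, host_bufs, dev_bufs = [], [], []
for i in range(engine.num_bindings):
    size = trt.volume(engine.get_binding_shape(i)) * engine.max_batch_size
    dtype = trt.nptype(engine.get_binding_dtype(i))
    h_buf = cuda.pagelocked_empty(size, dtype)
    d_buf = cuda.mem_alloc(h_buf.nbytes)
    host_bufs.append(h_buf)
    dev_bufs.append(d_buf)
    bindings.append(int(d_buf))

# Time a fixed number of inferences (the numbers needed for a TX2 vs. Coral comparison).
# For a real benchmark, copy actual preprocessed image data into host_bufs[0] first.
N = 100
start = time.time()
for _ in range(N):
    cuda.memcpy_htod(dev_bufs[0], host_bufs[0])        # copy input to the device
    context.execute(batch_size=1, bindings=bindings)   # synchronous inference
    cuda.memcpy_dtoh(host_bufs[-1], dev_bufs[-1])      # copy output back to the host
elapsed = time.time() - start
print("Average inference time: %.2f ms" % (elapsed / N * 1000.0))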

Thanks.
