ONNX Runtime error

Hi,

Yes, it’s recommended to use TensorRT instead.
TensorRT is our library for fast inference and is optimized for the Jetson platform.

You can test your ONNX model directly with the following command:

/usr/src/tensorrt/bin/trtexec --onnx=[onnx/model]
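
If it helps, a common follow-up is to save the built engine and then benchmark it separately. The flags below are standard trtexec options; model.engine is just an example output filename, and [onnx/model] stays a placeholder for your model path:

# Build a TensorRT engine from the ONNX model and save it (FP16 is optional)
/usr/src/tensorrt/bin/trtexec --onnx=[onnx/model] --saveEngine=model.engine --fp16

# Reload the saved engine and measure inference performance
/usr/src/tensorrt/bin/trtexec --loadEngine=model.engine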

Thanks.