How can I run the ONNX Runtime C++ API on Jetson OS?

Description

How can I run the ONNX Runtime C++ API on Jetson OS?

Environment

TensorRT Version: 10.3
GPU Type: Jetson
Nvidia Driver Version:
CUDA Version: 8.0
Operating System + Version: Jetson Nano
Baremetal or Container (if container which image + tag): Jetpack 4.6

I installed the Python onnxruntime library, but I also want to use the onnxruntime C++ API.
So how can I build the onnxruntime C++ API? (There is no documentation about this problem…)

Hi,
Please share the ONNX model and the script, if not already shared, so that we can assist you better.
Alongside, you can try a few things:

1. Validate your model with the below snippet:

check_model.py

import onnx

# Load the model and run the ONNX checker to validate it.
filename = "your_model.onnx"  # placeholder: path to your ONNX model
model = onnx.load(filename)
onnx.checker.check_model(model)
2. Try running your model with the trtexec command:
https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/trtexec
In case you are still facing the issue, please share the trtexec --verbose log for further debugging.
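
For example, an invocation along these lines (model.onnx is a placeholder for your model path):

$ trtexec --onnx=model.onnx --verbose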
Thanks!

I just want to know how I can build the onnxruntime library and use <onnxruntime_cxx_api.h> on a Jetson board.

Thanks @NVES

Hi,

We are moving this post to the Jetson forum to get better help.

Thank you.

Hi,

You can find the prebuilt package for Jetson at the link below.

https://elinux.org/Jetson_Zoo#ONNX_Runtime

The onnxruntime *.so libraries will be installed after running the pip3 command:

$ find / -iname libonnxruntime*
/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/libonnxruntime_providers_cuda.so
/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/libonnxruntime_providers_shared.so
/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/libonnxruntime_providers_tensorrt.so
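
For the C++ API specifically, below is a minimal, untested sketch of loading a model with <onnxruntime_cxx_api.h>, assuming the onnxruntime headers and a libonnxruntime.so are available to compile and link against (for example, from a source build or a release package); the model path "model.onnx" is a placeholder:

#include <onnxruntime_cxx_api.h>
#include <iostream>

int main() {
    // Create the ONNX Runtime environment and session options.
    Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "jetson-test");
    Ort::SessionOptions options;
    options.SetIntraOpNumThreads(1);

    // Optionally enable the CUDA execution provider
    // (this C++ helper is available in recent onnxruntime releases):
    // OrtCUDAProviderOptions cuda_options{};
    // options.AppendExecutionProvider_CUDA(cuda_options);

    // "model.onnx" is a placeholder for your model path.
    Ort::Session session(env, "model.onnx", options);

    std::cout << "Model loaded, input count: " << session.GetInputCount() << std::endl;
    return 0;
}

Compiling it would look roughly like g++ main.cpp -I<onnxruntime include dir> -L<dir containing libonnxruntime.so> -lonnxruntime, with the actual paths depending on where you installed or built onnxruntime.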

Thanks.


Thanks @AastaLLL!

Hi,
I have the same problem: I am trying to load an ONNX model on Jetson. I followed this answer and found libonnxruntime_providers_cuda.so, libonnxruntime_providers_shared.so, and libonnxruntime_providers_tensorrt.so as you said. However, I am still missing the corresponding libonnxruntime.so and libonnxruntime.so.1.12.0, and also onnxruntime_c_api.h.
Where can I find them?
P.S.: I found onnxruntime-linux-aarch64-1.11.0.tgz on the ONNX Runtime release page, which works on the Jetson Nano with all the files included, but it is CPU-only and very slow. Any combination of this release and the files from the answer causes a core dump error.


Hi user165035,
Please open a new topic for this issue. Thanks.