CUDA provider when using ONNX

Hi everyone,

I'm having difficulties adding the CUDA provider while using ONNX on Python 3.6.

It worked perfectly fine on my Jetson Nano without installing anything special (except JetPack).

Does anyone know what the problem might be? I tried adding the path to the project variables, but maybe I'm doing something wrong?

Thanks.

Hi,

It works in our environment. Here are the details:
Install onnxruntime 1.6.0 following the instructions at the link below:
https://elinux.org/Jetson_Zoo#ONNX_Runtime

And test:

$ python3
Python 3.6.9 (default, Oct  8 2020, 12:12:24) 
[GCC 8.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import onnxruntime as rt
>>> sess = rt.InferenceSession("/usr/src/tensorrt/data/mnist/mnist.onnx")
>>> sess.set_providers(['CUDAExecutionProvider'])

Thanks.

Thanks!
I built a new virtual env, re-installed it from this link, and it works!