Hello,
I installed onnxruntime-gpu on my Jetson Nano 2GB kit using 'python3.8 -m pip install onnxruntime_gpu-1.6.0-cp38-cp38-linux_aarch64.whl'.
When I try to import it in a Python 3.8 session, I get the error below:
import onnxruntime
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/bidyut/.local/lib/python3.8/site-packages/onnxruntime/__init__.py", line 56, in <module>
    raise import_capi_exception
  File "/home/bidyut/.local/lib/python3.8/site-packages/onnxruntime/__init__.py", line 23, in <module>
    from onnxruntime.capi._pybind_state import ExecutionMode  # noqa: F401
  File "/home/bidyut/.local/lib/python3.8/site-packages/onnxruntime/capi/_pybind_state.py", line 32, in <module>
    from .onnxruntime_pybind11_state import *  # noqa
ImportError: /lib/aarch64-linux-gnu/libm.so.6: version `GLIBC_2.29' not found (required by /home/bidyut/.local/lib/python3.8/site-packages/onnxruntime/capi/onnxruntime_pybind11_state.so)
How can I use the GPU to run a .onnx model? Kindly suggest a way forward.
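For reference, this is roughly what I want to do once the import works (a minimal sketch; model.onnx, the input name and the input shape are placeholders for my actual model, and the providers argument assumes a reasonably recent onnxruntime release):

import numpy as np
import onnxruntime as ort

# Ask for the CUDA provider first, falling back to CPU if it is unavailable.
session = ort.InferenceSession(
    "model.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)
print(session.get_providers())  # should include CUDAExecutionProvider when the GPU is used

# Placeholder input; the real name and shape come from the model.
input_name = session.get_inputs()[0].name
dummy = np.zeros((1, 3, 224, 224), dtype=np.float32)
outputs = session.run(None, {input_name: dummy})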
ONNX Runtime 1.6.0 sounds pretty old. Are you able to upgrade to a more recent release or compile from source on your platform? The issue sounds a lot like your Linux version ships a different C standard library than the one ONNX Runtime was compiled against.
Hi Max,
Many thanks for your kind attention. I have tried to upgrade to onnxruntime 1.11, but no luck. I think the issue is the GLIBC version: I have 2.27, but it needs 2.29 (see the check below). My intention is to use the GPU to run .onnx models. I am working on an NVIDIA Jetson Nano 2GB with:
CUDA 10.2
JetPack 4.6
cuDNN 8.2
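This is how I confirmed my GLIBC version (a quick check from Python; platform.libc_ver() is in the standard library and reports the C library the interpreter is linked against):

import platform

# On this Jetson Nano it prints something like ('glibc', '2.27'),
# while the wheel's onnxruntime_pybind11_state.so requires GLIBC_2.29.
print(platform.libc_ver())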
Kindly suggest the best way to install onnxruntime-gpu!
One way is to upgrade JetPack, and with it the operating system, to get a newer C library version. The only other alternative is to compile ONNX Runtime from source and link it against your current C library version.
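If you do build from source, you can verify that the resulting wheel actually has GPU support with a quick check (get_available_providers is part of the public Python API):

import onnxruntime as ort

# A CUDA-enabled build should list "CUDAExecutionProvider" here.
print(ort.get_available_providers())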
But Max, my question is: will it be possible to install a later version of JetPack on the NVIDIA Jetson Nano 2GB?
Probably not, as this Jetson has reached end of life from my quick research. I am not a Jetson expert.
Yes, true. Is there any other way to install onnxruntime-gpu on it then?
I am sorry, I really didn’t understand this option. :(