Installing onnxruntime inside Jetson-containers docker container Jetpack 6.2.1

So I’m having a lot of trouble figuring out how to get the onnxruntime wheel to install inside my l4t-pytorch Docker container from Dusty-NV’s GitHub page.

Things I tried:

wget https://pypi.jetson-ai-lab.dev/jp6/cu126/+f/e1e/9e3dc2f4d5551/onnxruntime_gpu-1.23.0-cp310-cp310-linux_aarch64.whl

This just keeps retrying the connection over and over, forever.

pip3 install onnxruntime-gpu==1.23.0

The same thing happens when I try this.

Is the website down? Sometimes I’m able to connect to it on my main PC, but I also get a “something went wrong” error sometimes. If anyone has a fix, please let me know. Additionally, could I just use this container instead and get most of the same packages as l4t-pytorch? jetson-containers/packages/ml/onnxruntime at master · dusty-nv/jetson-containers · GitHub

Thank you!

Why does it keep resorting to pypi.jetson-ai-lab.dev instead of .io? The .dev domain is down, so of course this will fail.
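If that’s the cause, one possible workaround (a sketch, assuming the .io mirror serves the same path layout as the .dev host) is to rewrite the wheel URL to the .io hostname before downloading, instead of waiting for pip to resolve the dead index:

```shell
#!/usr/bin/env bash
# Sketch of a workaround, assuming pypi.jetson-ai-lab.io mirrors the
# same /jp6/cu126/... layout as the (currently unreachable) .dev host.
WHL_URL="https://pypi.jetson-ai-lab.dev/jp6/cu126/+f/e1e/9e3dc2f4d5551/onnxruntime_gpu-1.23.0-cp310-cp310-linux_aarch64.whl"

# Bash string substitution: swap the first ".dev" for ".io" in the URL.
WHL_URL="${WHL_URL/.dev/.io}"
echo "$WHL_URL"

# Then fetch and install the wheel directly (run inside the container):
# wget "$WHL_URL"
# pip3 install ./onnxruntime_gpu-1.23.0-cp310-cp310-linux_aarch64.whl
```

Alternatively, `pip3 install onnxruntime-gpu==1.23.0 --index-url https://pypi.jetson-ai-lab.io/jp6/cu126` should skip the .dev index entirely, again assuming the .io mirror exposes a pip-compatible index at that path.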

I followed this and got it to work: I just pasted it in and reran the command I had tried before.

Hi,

You can find the packages for JetPack 6.2.1 below:

Thanks.
