PyTorch for Jetson Orin JetPack 5.1 installation issue

I tried to install PyTorch on a Jetson Orin 64GB developer kit, which has JetPack 5.1 and CUDA Toolkit 11.4 (the default installation), by following these instructions:

  1. Download the package:
    wget https://developer.download.nvidia.com/compute/redist/jp/v511/pytorch/torch-2.0.0+nv23.05-cp38-cp38-linux_aarch64.whl
  2. Run the install:
    pip3 install torch-2.0.0+nv23.05-cp38-cp38-linux_aarch64.whl
    ERROR: torch-2.0.0+nv23.05-cp38-cp38-linux_aarch64.whl is not a supported wheel on this platform.

Please help with this issue. Thanks.

Hi,

We can install it here without error.
Could you share your system information with us?

$ pip3 --version
$ apt show nvidia-jetpack

Thanks.

Here is the information from the Orin: pip3, JetPack, and CUDA nvcc:

$ pip3 --version
pip 23.3.2 from /home/rzhang/anaconda3/lib/python3.11/site-packages/pip (python 3.11)

$ apt show nvidia-jetpack
Package: nvidia-jetpack
Version: 5.1-b147
Priority: standard
Section: metapackages
Maintainer: NVIDIA Corporation
Installed-Size: 199 kB
Depends: nvidia-jetpack-runtime (= 5.1-b147), nvidia-jetpack-dev (= 5.1-b147)
Homepage: Jetson - Embedded AI Computing Platform | NVIDIA Developer
Download-Size: 29.3 kB
APT-Manual-Installed: yes
APT-Sources: https://repo.download.nvidia.com/jetson/common r35.2/main arm64 Packages
Description: NVIDIA Jetpack Meta Package

$ nvcc --version
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2022 NVIDIA Corporation
Built on Sun_Oct_23_22:16:07_PDT_2022
Cuda compilation tools, release 11.4, V11.4.315
Build cuda_11.4.r11.4/compiler.31964100_0

Thanks.

@Nqubits you have a conda environment active that is running Python 3.11, not the native Python 3.8 that these PyTorch wheels were built for on JetPack 5. Either disable Anaconda or switch it to Python 3.8, or you can rebuild PyTorch from source for Python 3.11 by following instructions similar to this post:
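
For the first two options, a minimal sketch, assuming a standard Anaconda setup (the environment name torch-jp5 is just an example, and whether the NVIDIA wheel runs correctly inside conda is not guaranteed):

# Option A: drop back to the system Python 3.8 for this shell
$ conda deactivate          # repeat until no (env) prefix is shown
$ python3 --version         # should report Python 3.8.x on JetPack 5

# Option B: a conda environment pinned to Python 3.8
$ conda create -n torch-jp5 python=3.8
$ conda activate torch-jp5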

I will try disabling the anaconda environment and use only the system Python 3.8 to install the Jetson PyTorch wheel.
Since my Orin is running JetPack 5.1, do I need to upgrade to JetPack 5.1.2 in order to use
https://developer.download.nvidia.com/compute/redist/jp/v512/pytorch/torch-2.1.0a0+41361538.nv23.06-cp38-cp38-linux_aarch64.whl ?

Also, has PyTorch 2.2 been released?
torch-2.2.0a0+6a974be.nv23.11-cp310-cp310-linux_aarch64.whl

After disabling conda, it works:

$ pip3 --version
pip 20.0.2 from /usr/lib/python3/dist-packages/pip (python 3.8)

$ pip3 install torch-2.0.0+nv23.05-cp38-cp38-linux_aarch64.whl

Successfully installed MarkupSafe-2.1.4 filelock-3.13.1 jinja2-3.1.3 mpmath-1.3.0 sympy-1.12 torch-2.0.0+nv23.5 typing-extensions-4.9.0
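
As a quick sanity check (a minimal sketch; it should print the torch version and True if the GPU is visible):

$ python3 -c "import torch; print(torch.__version__, torch.cuda.is_available())"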

Question: since Anaconda manages Python packages very well, is there a workaround to install the Jetson PyTorch wheel in a conda environment, besides building PyTorch from source?

Thanks.

I figured out the issue with installing PyTorch on the Jetson Orin:
torch-2.0.0+nv23.05-cp38-cp38-linux_aarch64.whl
has to be installed with Python 3.8, since the cp38 tag in the wheel name means it was built for Python 3.8.
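
One way to confirm this is to compare the wheel's tag against the tags the active interpreter accepts (note that pip marks the debug command as unstable, so its output format may change):

$ python3 --version
$ pip3 debug --verbose | grep cp38-cp38-linux_aarch64
# the cp38-cp38-linux_aarch64 tag must appear in the "Compatible tags"
# list for the wheel above to be installable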

After installation, the Python environment can't be upgraded to another version, such as 3.11. Two suggestions:

  1. Update the installation guide to note the Python 3.8 requirement.
  2. Find a better way to avoid this limitation.

Thanks.

For item 2, wheels could be generated for cp38/cp39/cp310/cp311 to support different Python versions, like the torch_cluster-* wheel packages do; see the sketch below:
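
For example, if one wheel per interpreter were published (the filenames below are illustrative, not actual NVIDIA releases), pip would pick the one whose cpXY tag matches the running Python:

# ./wheels/ with one hypothetical build per Python version:
#   torch-2.0.0+nv23.05-cp38-cp38-linux_aarch64.whl
#   torch-2.0.0+nv23.05-cp310-cp310-linux_aarch64.whl
#   torch-2.0.0+nv23.05-cp311-cp311-linux_aarch64.whl
$ pip3 install --no-index --find-links ./wheels torch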

Hi @Nqubits, we provide prebuilt PyTorch binaries for the default version of Python that comes with JetPack. So for JetPack 5 that's Python 3.8, and for JetPack 6 it's Python 3.10. For other versions of Python, you would need to build PyTorch from source - I have a Dockerfile that I use for doing this here:
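
As a rough sketch of how a Dockerfile-based source build is typically driven (the file name Dockerfile.pytorch and the PYTHON_VERSION build argument here are hypothetical placeholders, not the actual arguments of the linked Dockerfile):

# hypothetical invocation; consult the linked Dockerfile for the real arguments
$ docker build \
    --build-arg PYTHON_VERSION=3.11 \
    -t pytorch-builder:jp5 \
    -f Dockerfile.pytorch .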

@dusty_nv This is great. One question: does the Dockerfile include the PyTorch 2.0.0 patches as well? Thanks.

@Nqubits the patches for PyTorch 2.0 are commented out in there now that 2.1 is out, which included the fixes for sm87.

@dusty_nv Thanks a lot!
