PyTorch for Jetson - version 1.10 now available

@dusty_nv

I am having this same problem (gcc segfault on pybind_state.cc) when trying to build with python-3.6 from source.
pytorch 1.8.1 and 1.9.0 show the same failure.

Not sure where to go from here, but can’t get pytorch to build on a jetson.

I’ve gotten past the pybind_state.cc segfault (and a second later one) with the following patch:

pytorch.1.9.0-jetpack-4.6.0.second.patch (1.3 KB)

The patch just lowers a few files from -O3 to -O1 optimization. Hopefully the performance impact is negligible.
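For anyone curious what "lowering a few files from -O3 to -O1" looks like mechanically (the attached patch is the authoritative change; the file name and flag below are just an illustration), CMake lets you override the optimization level for individual source files:

```cmake
# Hypothetical sketch only -- see the attached patch for the real change.
# Force -O1 for a translation unit that makes GCC segfault at -O3:
set_source_files_properties(
  caffe2/python/pybind_state.cc
  PROPERTIES COMPILE_FLAGS "-O1"
)
```

Since only a handful of files are affected, the runtime performance impact should indeed be small.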

Environment (some packages pulled from focal) is now:
CMake version : 3.10.2
C++ compiler version : 7.5.0
TORCH_VERSION : 1.9.0
Python version : 3.8.10

Although it will likely work with the vanilla JetPack install.

Hi @dusty_nv
I need to download the torch .whl file, but the download is very slow. I have checked my internet speed, but nothing helped. Could you please resolve this issue as soon as possible?


Hi @bhardwajbhaskar7488, here is a Google Drive link to the PyTorch wheels. Does it download any faster for you?

https://drive.google.com/drive/folders/1PGiK_Kgfs7G3xhFV00mOik17wXm7WSGW?usp=sharing


@junxing.liang please keep discussion in the other topic that you have opened, thank you.

The default version of Python shipped with Ubuntu is no longer 3.6 but 3.7 and higher. Previous comments stated that when py3.6 is no longer the default version shipped with Ubuntu, wheels would be built for the next version of Python. Where are the Python 3.7 wheels for the Nano 2GB? I do not see them anywhere on the site, and compiling PyTorch on the Nano takes well over 14 hours. A second question: do the wheels built by PyTorch work on the Nano 2GB? If so, perhaps we can just use those.

Correct me if I’m wrong, but the default version of Python that ships with Ubuntu 18.04 (which is the version of Ubuntu that Nano uses) is still Python 3.6.

Do you mean the ones from pytorch.org? The ARM/aarch64 wheels from pytorch.org can technically run on Jetson; they were just not built with CUDA/GPU acceleration (CPU only).

Thank you, Dusty, for the speedy reply. Yes, those are the ones I mean. That's horrible news. I love my new Nano, but I can't get PyTorch compiled for Python 3.7, and the TTS engine I am trying to run requires numba, which now requires Python > 3.6. I am migrating our app from the Pi 4 mainly because of the GPU, so CPU-only support does not work for me. My PyTorch build broke on the Nano after about 14 hours, but even if it had compiled I doubt it would have worked because of the register issues, compiler options, GPU specifics, code mods (which I don't mind, by the way), and the general lack of sound documentation. I would have no problem building a wheel for 3.7, 3.8, and 3.9 if someone could show me some pointers beyond this doc, which I followed but which did not work: qengineering.eu/install-pytorch-on-jetson-nano.html. Maybe send me your makefile; I don't mind running the build on the Nano overnight. Anyway, thanks again for the rapid reply.

@ken.smith there are also build instructions (for Python 3.6) and patches at the top of this thread under the Build from Source section, and you may find this topic about building PyTorch for Python 3.8 useful:

Thank you again for your rapid response. I will follow those instructions and try it with Python 3.8, which seems to have worked for him. Is there anything special I have to set to get the GPU (not CPU) version built, or will it pick it up from the environment?

There are some environment variables that you should set that you can find in the Build from Source section at the top of this topic. Also since you are on Nano 2GB I would mount a substantial amount of swap memory too.
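As a rough sketch of what that environment setup looks like before running the build (the exact variables and values in the Build from Source section at the top of this topic take precedence; the ones below are illustrative):

```shell
# Hypothetical subset of build-time variables for a from-source PyTorch
# build on Jetson; defer to the Build from Source post for the real list.
export USE_NCCL=0                 # NCCL is not supported on Jetson
export USE_DISTRIBUTED=0          # skip distributed backends
export USE_QNNPACK=0
export USE_PYTORCH_QNNPACK=0
export TORCH_CUDA_ARCH_LIST="5.3;6.2;7.2"   # 5.3 = Nano (Maxwell), 7.2 = Xavier
export PYTORCH_BUILD_VERSION=1.7.0          # match the branch you checked out
export PYTORCH_BUILD_NUMBER=1
echo "CUDA archs: $TORCH_CUDA_ARCH_LIST"
```

On a Nano 2GB, mounting several GB of swap before starting the build matters just as much as the variables, since the compile can otherwise run out of memory.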

Thank you. I have configured 4 GB of swap; I hope that is enough. I am following the 3.8 walkthrough, and I will also look for additional variables in the Build from Source post above. If it works, I can send you the wheels.

May I check whether the PyTorch v1.5.0 wheel for Python 3.6 (torch-1.5.0-cp36-cp36m-linux_aarch64.whl) can work on JetPack 4.2?

Likewise, can I check whether cp36-cp36m wheels only work with Python 3.6? If so, are there cp37 versions available?

I don’t believe so, it was built for JetPack 4.4. You could try compiling PyTorch 1.5 from source on JetPack 4.2, but I’m not entirely sure if the versions of CUDA/cuDNN are compatible.

The cp36 wheels only work with Python 3.6. For Python 3.7/etc. you would need to build PyTorch from source. See the couple of posts above yours for some additional info about it.
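The compatibility is encoded right in the wheel filename: per PEP 427, the pieces after the version are the Python tag, ABI tag, and platform tag. A quick sketch of decoding the wheel mentioned above:

```python
# Decode a wheel filename (PEP 427): name-version-pytag-abitag-platform.whl
wheel = "torch-1.5.0-cp36-cp36m-linux_aarch64.whl"
name, version, py_tag, abi_tag, platform = wheel[: -len(".whl")].split("-")

print(py_tag)    # cp36  -> built for the CPython 3.6 interpreter
print(abi_tag)   # cp36m -> CPython 3.6 ABI; not importable by 3.7 (cp37m)
print(platform)  # linux_aarch64 -> 64-bit ARM Linux, e.g. Jetson
```

So a cp37 build requires a separate wheel; pip will refuse to install a cp36 wheel into a 3.7 environment.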

I cannot reach ftp sites from behind corporate proxy (blocked). Is there any other way to wget packages?

It built (took about 40 hours) but now when I run pytorch I see this …

    ken@ubuntu:~/Pytorch1.7/pytorch$ python3.7
    Python 3.7.12 (default, Oct 12 2021, 22:22:57)
    [GCC 7.5.0] on linux
    Type "help", "copyright", "credits" or "license" for more information.
    >>> import torch
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/home/ken/Pytorch1.7/pytorch/torch/__init__.py", line 218, in <module>
        ''').strip()) from None
    ImportError: Failed to load PyTorch C extensions:
        It appears that PyTorch has loaded the `torch/_C` folder
        of the PyTorch repository rather than the C extensions which
        are expected in the `torch._C` namespace. This can occur when
        using the `install` workflow. e.g.
            $ python setup.py install && python -c "import torch"

        This error can generally be solved using the `develop` workflow
            $ python setup.py develop && python -c "import torch"  # This should succeed
        or by running Python from a different directory.

Is there something else I need to run? I do not see pytorch in my site-packages.

@cmtrhnn can you access Google Drive to download the wheels? I have a mirror here:

https://drive.google.com/drive/folders/1PGiK_Kgfs7G3xhFV00mOik17wXm7WSGW?usp=sharing

Did you build the wheel and then install the wheel?

Make sure you aren’t trying to import torch from the directory where the source code is located.

Ah, my mistake. I did not realize that. After I compiled PyTorch on the Nano 2GB, I ran setup.py bdist_wheel, which created a wheel for Python 3.7 / PyTorch 1.7 under the dist/ directory. I then installed the wheel, moved out of the development directory, and everything worked great. I can import torch, and it shows CUDA is enabled. I can share the wheel if you like, but I'm not sure where to put it, as it is pretty big. Thanks for all your help, Dusty!