PyTorch for Jetson - version 1.11 now available

Hi @6200575, if you run g++ from your terminal, is it found on your system?

Can you run sudo apt-get install build-essential?


ERROR: torch-1.8.0-cp36-cp36m-linux_aarch64.whl is not a supported wheel on this platform.

Hi @hanyanfeng2012, are you trying to install it with pip3?

Is this the URL you downloaded the wheel from? https://nvidia.box.com/shared/static/p57jwntv436lfrd78inwl7iml6p13fzh.whl
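The cp36 tags in the wheel filename mean it can only be installed by CPython 3.6 on aarch64. As a quick sanity check (a diagnostic sketch, not from the original post), you can print which interpreter version and architecture your pip3 will install for:

```shell
# Print the Python version and machine architecture pip3 builds against;
# the wheel requires (3, 6) and aarch64 to be accepted
python3 -c "import sys, platform; print(sys.version_info[:2], platform.machine())"
```

If this prints anything other than `(3, 6)` and `aarch64`, pip will reject the wheel as unsupported.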

Is it possible to support a higher version of Python here, such as Python 3.8 or 3.9?

Yes, you would need to build PyTorch from source. The instructions I used to build it for Python 3.6 are in the first post of this topic (substitute pip3.8 for pip3, etc.). Various folks on this topic have built it for 3.7/3.8, and there was another topic about it here:

I am trying to install the torch-vision but an error is showing up this error -->>

,
the above error is also shown when trying to import torch,
and the first line of setup.py is also import torch so import torch is a problem and my python version is 3.6.9 .

Hi @ashishad14, does the same thing happen if you just import numpy?

If so, try setting export OPENBLAS_CORETYPE=ARMV8 beforehand
(Illegal instruction (core dumped) on import for numpy 1.19.5 on ARM64 · Issue #18131 · numpy/numpy · GitHub )
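If the workaround helps, one way to make it persistent across sessions is to add it to your shell startup file (a sketch; assumes bash and that ~/.bashrc is your startup file):

```shell
# Append the workaround to the shell startup file so every new terminal gets it
echo 'export OPENBLAS_CORETYPE=ARMV8' >> ~/.bashrc

# Also apply it to the current shell
export OPENBLAS_CORETYPE=ARMV8
```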

No, numpy is imported without any error.

What is the actual error you are getting? I don’t see it in your post above.

Do you get an error if you run this?

export OPENBLAS_CORETYPE=ARMV8
python3
>>> import numpy
>>> import torch

[quote=“dusty_nv, post:769, topic:72048”]
export OPENBLAS_CORETYPE=ARMV8
[/quote]


I am not getting any error now. I had been working on it for 8 hours straight; after I took a break and powered off the machine, the library started importing normally.
Thanks, I will post if I face any further problems.

OK gotcha, glad that it is working for you now.


If you are building/installing torchvision into a python virtual environment (virtualenv), sudo is not required:

python3 setup.py install

Also, I noticed that if you install torchvision for the current user (e.g. (py36dl) me@jetsonnano:~/torchvision$ python3 setup.py install --user), Python sometimes cannot locate the torchvision package, especially from inside a virtualenv or outside the user’s HOME directory. Modifying PYTHONPATH is a possible workaround.
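For example (a sketch; the site-packages path is an assumption for a Python 3.6 `--user` install and may differ on your system):

```shell
# Point Python at the user-level site-packages directory where
# `setup.py install --user` typically places packages on Python 3.6
export PYTHONPATH="$HOME/.local/lib/python3.6/site-packages:$PYTHONPATH"
```

Adding the same line to your shell startup file makes the fix persistent.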


Hi everyone,
earlier I was facing an issue importing PyTorch in the terminal; that problem got resolved, and now I am facing a problem importing PyTorch in JupyterLab.


As the screenshots show, the error is:

OSError: /usr/lib/aarch64-linux-gnu/libgomp.so.1: cannot allocate memory in static TLS block

I ran into the same error. The following, which I found here, resolved the issue for me: export LD_PRELOAD=/usr/lib/aarch64-linux-gnu/libgomp.so.1
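Since Jupyter kernels inherit the environment of the process that launched them, one approach (a sketch; assumes you start JupyterLab from a terminal) is to export the variable in the shell before launching:

```shell
# Kernels started by this JupyterLab instance will inherit LD_PRELOAD,
# so libgomp gets loaded early enough to claim its static TLS block
export LD_PRELOAD=/usr/lib/aarch64-linux-gnu/libgomp.so.1
# then launch JupyterLab from the same shell:
#   jupyter lab
```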


Yes, I tried it many times, and then a box shows up saying the kernel is restarting, as the kernel dies.

If you open a terminal from within Jupyter, and echo $LD_PRELOAD, does it show /usr/lib/aarch64-linux-gnu/libgomp.so.1?

I think this is due to a compatibility issue with the JetPack version. As you have mentioned in this (PyTorch for Jetson - version 1.8.0 now available),
the latest version of PyTorch only works with JetPack 4.5 and not with JetPack 4.5.1, which I currently have on my SD card.
Thank you @dusty_nv @dkreutz @kaisar.khatak

I don’t believe so; others and I run those wheels on JetPack 4.4 / 4.5 / 4.5.1 without issue. The CUDA libraries are essentially the same version across those JetPack releases, so they can run the same wheel. Also, that intermittent issue with libgomp has appeared as far back as 2019 (not sure why).

Does the problem only occur for you in JupyterLab?

In an ordinary terminal (not in Jupyter), can you run:

$ export LD_PRELOAD=/usr/lib/aarch64-linux-gnu/libgomp.so.1
$ python3
>>> import torch

Hi @dusty_nv ,
Is there already a patch available to build LibTorch from source for JP4.5.1?
(Or is there no patching needed anymore?)

best regards,
fabi

Hi @fabian.groh, libtorch should automatically be built when you build the PyTorch wheel.

The patch to the PyTorch source typically depends on the version of PyTorch you are patching and not the version of JetPack. For example, the PyTorch v1.8.0 patch works on JetPack 4.4 - 4.5.1.
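For reference, applying the patch is just applying a diff to the checked-out source before building. A minimal, self-contained demonstration of the `git apply` mechanism (all file names here are made up for illustration; the real patch comes with the build instructions in the first post):

```shell
# Set up a throwaway repo to demonstrate patching
mkdir -p /tmp/patch-demo && cd /tmp/patch-demo
rm -rf repo demo.patch
git init -q repo && cd repo
echo "original line" > file.txt
git add file.txt
git -c user.email=demo@example.com -c user.name=demo commit -qm "init"

# A unified diff, normally downloaded alongside the build instructions
cat > ../demo.patch <<'EOF'
diff --git a/file.txt b/file.txt
--- a/file.txt
+++ b/file.txt
@@ -1 +1 @@
-original line
+patched line
EOF

# Apply it to the working tree, as you would with the PyTorch patch
git apply ../demo.patch
cat file.txt   # now contains: patched line
```

With the real PyTorch source, the flow is the same: check out the matching version tag, apply the version-specific patch, then build the wheel (which also produces libtorch).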