Best way to install LibTorch?


What is the best/easiest way to install LibTorch so that I can use it in C++ on a Jetson AGX Xavier? I am currently using the CPU version from the PyTorch website, so torch::cuda::is_available() returns false at the moment (CUDA 10.2 is installed if I check in the terminal). The CUDA version on the website is unfortunately not built for aarch64, so I need to build it on my device, I suppose? Or do you provide a pre-built aarch64 version of LibTorch somewhere that I can just drop onto my Jetson?

Thankful for any help I can get.
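Before picking a build, it can help to confirm what the board actually reports. A minimal sketch (the CUDA path below is the JetPack default; adjust if CUDA is installed elsewhere):

```shell
# Hedged sketch: confirm the architecture and CUDA toolkit install
# before choosing a LibTorch/PyTorch build.
arch=$(uname -m)
echo "arch: $arch"   # a Jetson AGX Xavier should report "aarch64"

if [ -x /usr/local/cuda/bin/nvcc ]; then
    /usr/local/cuda/bin/nvcc --version | tail -n 1
else
    echo "nvcc not found under /usr/local/cuda"
fi
```

If the architecture is aarch64, the x86_64 LibTorch builds from the PyTorch website will not work, which matches the situation described above.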


We do have prebuilt PyTorch packages for Jetson users.
You can find the package here or a container here.
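For reference, the usual install flow for those packages looks roughly like the sketch below. The wheel filename is a placeholder (use the one matching your JetPack release from the linked package page); the apt dependencies are the ones the thread itself mentions:

```shell
# Hedged sketch of the typical Jetson wheel install; the .whl filename
# is a placeholder, not a real artifact name.
sudo apt-get install -y libopenblas-base libopenmpi-dev libomp-dev
pip3 install numpy
pip3 install torch-<version>-linux_aarch64.whl   # placeholder filename
python3 -c "import torch; print(torch.cuda.is_available())"
```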


Hi, thanks for the response.

Unfortunately, I run into this error after following the installation instructions and verifying with Python:

python3 -c 'import torch'
OSError: cannot open shared object file: No such file or directory
sudo apt-get install libopenblas-base libopenmpi-dev libomp-dev
libomp-dev is already the newest version (1:10.0-50~exp1).
libopenmpi-dev is already the newest version (4.0.3-0ubuntu1).
libopenblas-base is already the newest version (0.3.8+ds-1ubuntu0.20.04.1).
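When the OSError doesn't name the missing library, ldd on torch's native extension will usually list every unresolved dependency. A sketch, assuming a standard pip install location (adjust the search paths for a virtualenv):

```shell
# Hedged sketch: find torch's native extension without importing it
# (the import itself is what fails), then list unresolved libraries.
TORCH_SO=$(find /usr/local/lib /usr/lib ~/.local/lib \
    -name 'libtorch_python.so' 2>/dev/null | head -n 1)
ldd "$TORCH_SO" | grep "not found"
```

Any line printed by the grep is a library that still needs to be installed or added to the linker path.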


I also tried to run the Docker image (using JetPack 4.6.1) with no luck:

sudo docker run -it --rm --runtime nvidia --network host
docker: Error response from daemon: failed to create shim: OCI runtime create failed: container_linux.go:380: starting container process caused: error adding seccomp filter rule for syscall clone3: permission denied: unknown.

Can you try following these steps to restore your docker install? Docker fails to create container after upgrading docker on Jetpack 4.9 · Issue #108 · dusty-nv/jetson-containers · GitHub
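The linked issue covers the full restore. As a stopgap only (an assumption on my part, not the fix from that issue), the clone3 seccomp denial can usually be bypassed by disabling the seccomp filter for that one container:

```shell
# Stopgap sketch (assumption, not the full restore from the linked
# issue): run without the default seccomp profile so the clone3
# syscall is not blocked. <image> is a placeholder.
sudo docker run -it --rm --runtime nvidia --network host \
    --security-opt seccomp=unconfined <image>
```

Running unconfined weakens container isolation, so it is only a way to confirm the diagnosis, not a permanent setup.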

Is this on JetPack 5.0 DP? Because those package versions are for focal (20.04), not bionic (18.04).

On JetPack 5.0, I have the same versions installed and don’t get the MPI library error. You might want to try installing my PyTorch 1.11 wheel instead and see if that helps.
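Coming back to the original LibTorch/C++ question: the pip wheel bundles libtorch and its CMake config, so once the wheel imports cleanly you can point CMake at it instead of installing a separate LibTorch. A sketch using torch.utils.cmake_prefix_path, which PyTorch provides for exactly this:

```shell
# Hedged sketch: build a C++ project against the libtorch that ships
# inside the installed PyTorch wheel.
TORCH_CMAKE=$(python3 -c "import torch; print(torch.utils.cmake_prefix_path)")
cmake -DCMAKE_PREFIX_PATH="$TORCH_CMAKE" ..
make
```

With this setup, torch::cuda::is_available() in C++ should match what torch.cuda.is_available() reports from Python on the same board.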

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.