torch.cuda.is_available() returns False

Hi,

I am working on an NVIDIA Jetson Orin Nano Developer Kit. I have installed the latest firmware, L4T 36.4.3, with JetPack 6.2.

I tried installing torch and torchvision using the command below:
pip install --pre torch torchvision torchaudio --extra-index-url https://developer.downloa/jp/v62

However, when I run torch.cuda.is_available(), it returns False.
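
For reference, this is the quick check I am running. A CUDA-enabled build reports a version string via torch.version.cuda, while a CPU-only wheel (such as the default aarch64 wheel from PyPI) reports None, which would explain is_available() returning False:

import torch

# A CPU-only wheel reports torch.version.cuda as None,
# which explains torch.cuda.is_available() returning False.
print("torch:", torch.__version__)
print("built with CUDA:", torch.version.cuda)
print("CUDA available:", torch.cuda.is_available())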

More details:

sudo apt show nvidia-jetpack
Package: nvidia-jetpack
Version: 6.2+b77
Priority: standard
Section: metapackages
Source: nvidia-jetpack (6.2)
Maintainer: NVIDIA Corporation
Installed-Size: 199 kB
Depends: nvidia-jetpack-runtime (= 6.2+b77), nvidia-jetpack-dev (= 6.2+b77)
Homepage: https://developer.nvidia.com/embedded-computing
Download-Size: 29.3 kB
APT-Sources: https://repo.download.nvidia.com/jetson/common r36.4/main arm64 Packages
Description: NVIDIA Jetpack Meta Package

uname -a:
Linux Jetson 5.15.148-tegra #1 SMP PREEMPT Tue Jan 7 17:14:38 PST 2025 aarch64 aarch64 aarch64 GNU/Linux

lsb_release -a:
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 22.04.5 LTS
Release: 22.04
Codename: jammy

Hi,

You can run the commands below to install torch 2.6.0 and torchvision 0.21.0:

wget https://pypi.jetson-ai-lab.dev/jp6/cu126/+f/6cc/6ecfe8a5994fd/torch-2.6.0-cp310-cp310-linux_aarch64.whl#sha256=6cc6ecfe8a5994fd6d58fb6d6eb73ff2437428bb4953f3ebaa409f83a5f4db99
pip install torch-2.6.0-cp310-cp310-linux_aarch64.whl
wget https://pypi.jetson-ai-lab.dev/jp6/cu126/+f/aa2/2da8dcf4c4c8d/torchvision-0.21.0-cp310-cp310-linux_aarch64.whl#sha256=aa22da8dcf4c4c8dc897e7922b1ef25cb0fe350e1a358168be87a854ad114531
pip install torchvision-0.21.0-cp310-cp310-linux_aarch64.whl
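
After installing both wheels, you can verify that the GPU is actually usable with a short sketch like the one below (the tensor size is arbitrary, just for illustration):

import torch

# Confirm the CUDA build is detected and run a small matmul on the GPU.
assert torch.cuda.is_available(), "CUDA not available"
print("device:", torch.cuda.get_device_name(0))  # should report the Orin GPU

x = torch.randn(1024, 1024, device="cuda")
y = x @ x                 # executes on the GPU
torch.cuda.synchronize()  # wait for the kernel to finish
print("ok:", y.shape)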

Thanks

Thanks, it worked.

I tried running the command "nvidia-smi" to monitor GPU usage, but it shows "No running processes found". Is there any way to monitor the GPU usage?

Hi,

nvidia-smi is not supported on Jetson.
Use tegrastats or install jtop instead.
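
For example, tegrastats prints utilization in the terminal (run: sudo tegrastats), and jtop comes from the jetson-stats package (install with: sudo pip3 install jetson-stats), which also exposes a Python API. A minimal sketch, assuming the stats dictionary exposes a GPU utilization field as in recent jetson-stats releases:

from jtop import jtop  # provided by the jetson-stats package

# Poll the board a few times and print GPU utilization.
# Note: the exact keys in jetson.stats vary between jetson-stats
# releases; 'GPU' holding a utilization percentage is an assumption.
with jtop() as jetson:
    for _ in range(5):
        if jetson.ok():  # blocks until a fresh sample is available
            print("GPU load:", jetson.stats.get("GPU"), "%")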

Thanks

