Using PyTorch on GPU after upgrading CUDA to 11.8 on JetPack 5

I managed to upgrade CUDA to 11.8 on an AGX Xavier with JetPack 5.1 inside a container, but after that I could not use PyTorch on the GPU, as torch.cuda.is_available() returns False. Any suggestions?
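When is_available() returns False it helps to first check whether the installed wheel was built with CUDA at all, and which CUDA version it expects. A minimal diagnostic sketch (the `diagnose` helper is a hypothetical name, not a PyTorch API):

```python
import importlib.util

def diagnose():
    """Return a short hint about why torch.cuda.is_available() may be False."""
    if importlib.util.find_spec("torch") is None:
        return "torch is not installed in this environment"
    import torch
    if torch.version.cuda is None:
        # The wheel was compiled without CUDA support at all.
        return "CPU-only build: reinstall a CUDA-enabled wheel"
    if not torch.cuda.is_available():
        # Wheel has CUDA baked in, but the runtime/driver is not usable.
        return ("built for CUDA %s but the runtime is not usable: "
                "check the driver and the container's CUDA install"
                % torch.version.cuda)
    return "CUDA OK (built for %s)" % torch.version.cuda

print(diagnose())
```

If this reports a CPU-only build, the wheel itself is the problem; if it reports an unusable runtime, the mismatch is between the wheel's CUDA version and what the container provides.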

@YoushaaMurhij you might want to try re-installing the PyTorch 2.0 wheel that has CUDA support from here:

Or barring that, rebuild PyTorch from source for CUDA 11.8 following the Build from Source instructions in that thread. I haven’t tried upgrading CUDA with PyTorch before, so YMMV. Also, if/when you go to build torchvision, the CUDA versions will need to match, otherwise PyTorch will throw an error about it.
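The match that matters here is the major.minor CUDA version baked into each wheel. A small sketch of that comparison (the helpers below are illustrative, not torchvision's actual check):

```python
def cuda_major_minor(version_string):
    """Parse a CUDA version like '11.8' or '11.8.89' into (major, minor)."""
    parts = version_string.split(".")
    return int(parts[0]), int(parts[1])

def versions_match(torch_cuda, vision_cuda):
    """True when both wheels target the same CUDA major.minor release."""
    return cuda_major_minor(torch_cuda) == cuda_major_minor(vision_cuda)

print(versions_match("11.8", "11.8.89"))  # True: patch level is irrelevant
print(versions_match("11.4", "11.8"))     # False: torchvision would error
```

In practice you would compare `torch.version.cuda` against the CUDA version torchvision was built for.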

@dusty_nv, thanks for your response. I have already tried to install PyTorch from this wheel: torch-2.0.0+nv23.05-cp38-cp38-linux_aarch64.whl, but the same problem is still there. I will try to install PyTorch from source.

Installing PyTorch from source does not solve the problem.
I wonder if I need to upgrade the CUDA version on the host (Jetson Xavier with JetPack 5) before trying to upgrade it inside a container, or does the l4t container just have direct access to the CUDA that is installed on the host?

@YoushaaMurhij on JetPack 5, CUDA/cuDNN/TensorRT/etc. are installed inside the container (as opposed to being mounted from your device, like they were on JetPack 4). I haven’t attempted to upgrade CUDA and try it with PyTorch, so I’m not sure what the proper sequence is to try.
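Given that, it is worth confirming which CUDA the container actually sees, rather than assuming it is the host's. A small sketch that locates the CUDA runtime library and the `nvcc` on PATH (paths will vary with the container image; this only reports what is visible):

```python
import ctypes.util
import shutil
import subprocess

# Where does the dynamic loader find the CUDA runtime, if anywhere?
print(ctypes.util.find_library("cudart"))  # None if no runtime lib is visible

# Which nvcc (and therefore which toolkit) is on PATH inside the container?
nvcc = shutil.which("nvcc")
print(nvcc)
if nvcc:
    out = subprocess.run([nvcc, "--version"], capture_output=True, text=True)
    print(out.stdout.splitlines()[-1])  # last line carries the release string
```

If this shows CUDA 11.4 while the PyTorch wheel was built for 11.8, that mismatch would explain is_available() returning False.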

There is a working CUDA 12 for Jetson available from here: CUDA Toolkit 12.0 Downloads | NVIDIA Developer
I had to use at least the assembler ptxas from there to get version 2 of OpenAI’s Triton language working on a Jetson Orin NX.

I have tried CUDA 11.8 but could not manage to use PyTorch on the GPU. Have you succeeded in upgrading to CUDA 12.0, @herr_dieter_graef?

No, I did not, since I feared problems with all the other stuff I build from source. I only used the assembler from CUDA 12, but since it is a release for Jetson, it should work.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.