CUDA error on Jetson Orin

I was running a deep learning program on my Jetson Orin and suddenly got this error:

UserWarning: CUDA initialization: Unexpected error from cudaGetDeviceCount(). Did you run some cuda functions before calling NumCudaDevices() that might have already set an error? Error 804: forward compatibility was attempted on non supported HW (Triggered internally at /opt/pytorch/pytorch/c10/cuda/CUDAFunctions.cpp:108.)

Earlier, when I had just created this conda environment, torch.cuda.is_available() returned True and I could use CUDA and the GPU. What should I do? The CUDA and torch installations were tested without issues. (My guess is an accidental upgrade broke something.)
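For reference, here is a minimal diagnostic sketch (not a fix) that re-runs the CUDA check from a fresh Python process, since the 804 warning is raised once at CUDA initialization and then cached for the lifetime of the process. It assumes PyTorch is importable in that interpreter:

```python
import subprocess
import sys

# Re-run the CUDA check in a fresh interpreter, so we see the warning
# exactly as CUDA initialization emits it, not a cached earlier state.
code = (
    "import torch; "
    "print(torch.__version__); "
    "print(torch.cuda.is_available())"
)
result = subprocess.run(
    [sys.executable, "-c", code],
    capture_output=True,
    text=True,
)
print(result.stdout)
print(result.stderr)  # the 804 warning, if any, appears here
```

Running this after each change to the environment makes it easy to tell whether a given step actually affected CUDA initialization.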

Your topic was posted in the wrong category. I am moving this to the Jetson AGX Orin forum for visibility.



Could you share the environment details with us?
Which JetPack version do you use, and how did you install the PyTorch package?


- JetPack: 5.1-b147
- CUDA: 11.4
- cuDNN:
- torch: 2.0.0+nv23.5
- torchvision: 0.15.2

I downloaded torch from this page: PyTorch for Jetson. There should be no problem with any of this, because I have run the algorithm in this environment before. Then CUDA suddenly started showing this error. I suspect it was caused by an update or upgrade, but I don't know what to do.


Did you get these versions from the current platform, or are they the versions you installed before?

It would be good to re-check the versions on your platform now, to see whether anything has changed.

I’m sorry, I don’t understand. Which version are you referring to? Do you mean this information? # R35 (release), REVISION: 2.1, GCID: 32413640, BOARD: t186ref, EABI: aarch64, DATE: Tue Jan 24 23:38:33 UTC 2023. My JetPack should be the latest version.
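If it helps, here is a small sketch (the helper name is mine) that extracts the L4T version from the first line of /etc/nv_tegra_release; L4T 35.2.1 is the release that ships with JetPack 5.1:

```python
import re

# Hypothetical helper: pull the L4T version out of the first line of
# /etc/nv_tegra_release (e.g. "R35 ... REVISION: 2.1" -> "35.2.1").
def l4t_version(release_line):
    m = re.search(r"R(\d+)\s*\(release\),\s*REVISION:\s*([\d.]+)", release_line)
    if m is None:
        raise ValueError("unrecognized nv_tegra_release format")
    return "{}.{}".format(m.group(1), m.group(2))

line = ("# R35 (release), REVISION: 2.1, GCID: 32413640, BOARD: t186ref, "
        "EABI: aarch64, DATE: Tue Jan 24 23:38:33 UTC 2023")
print(l4t_version(line))  # -> 35.2.1
```

Comparing this value before and after a suspected apt upgrade is a quick way to confirm whether the BSP itself changed.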

Based on this, do you know whether there is any CUDA version difference between the working and failing environments?
Or does it look the same?

Also, the message is a warning rather than a fatal error.
Are you able to run the program even though it reports this message?


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.