Hi, I built an image from nvcr.io/nvidia/cuda:11.8.0-devel-ubuntu22.04 and installed Python 3.11.0 and PyTorch. But calling torch.cuda.is_available() gives an error:
UserWarning: CUDA initialization: Unexpected error from cudaGetDeviceCount(). Did you run some cuda functions before calling NumCudaDevices() that might have already set an error? Error 804: forward compatibility was attempted on non supported HW (Triggered internally at …/c10/cuda/CUDAFunctions.cpp:108.)
return torch._C._cuda_getDeviceCount() > 0
My server env:
NVIDIA driver (from nvidia-smi): 470.57.02
CUDA: 11.8
GPU: RTX 2080 Ti
The CUDA compatibility documentation says CUDA 11.x needs a driver version >= 450. Why can't I use torch in my container?
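For reference, here is the sanity check I did on the version numbers. This is a hypothetical helper of my own (not part of CUDA or PyTorch), assuming 450.80.02 is the documented minimum driver for CUDA 11.x — by this comparison my host driver should be new enough, which is why the error confuses me:

```python
# Hypothetical helper: compare my host driver version against the
# documented minimum driver for CUDA 11.x (assumed to be 450.80.02).
def version_tuple(v: str) -> tuple:
    """Parse a dotted version string like '470.57.02' into a tuple of ints."""
    return tuple(int(part) for part in v.split("."))

host_driver = "470.57.02"        # reported by nvidia-smi on my server
min_driver_cuda11 = "450.80.02"  # assumed minimum for CUDA 11.x

# The host driver is above the documented floor for CUDA 11.x.
print(version_tuple(host_driver) >= version_tuple(min_driver_cuda11))  # → True
```

So the plain version comparison passes, yet error 804 mentions "forward compatibility ... on non supported HW", which suggests the container runtime is trying a different compatibility path than this simple check.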