Cannot use PyTorch in CUDA container, error code 804

Hi, I built an image and installed Python 3.11.0 and PyTorch, but calling torch.cuda.is_available() raises an error:
UserWarning: CUDA initialization: Unexpected error from cudaGetDeviceCount(). Did you run some cuda functions before calling NumCudaDevices() that might have already set an error? Error 804: forward compatibility was attempted on non supported HW (Triggered internally at …/c10/cuda/CUDAFunctions.cpp:108.)
return torch._C._cuda_getDeviceCount() > 0

My server environment:
driver (nvidia-smi): 470.57.02
CUDA: 11.8
GPU: RTX 2080 Ti

The CUDA documentation says CUDA 11.8 needs a driver version > 450. Why can I not use torch in my container?

The container is set up to attempt forward compatibility if necessary, i.e. when the host driver is older than what the container's CUDA version natively requires. That attempt failed because forward compatibility is only supported on data-center GPUs, not on a GeForce GPU like yours (hence error 804: "forward compatibility was attempted on non supported HW"). If you want to use that container on that GPU, I suggest upgrading the driver on your machine to one that has native support for CUDA 11.8, i.e. R520 or newer. I generally would recommend just installing the latest driver.
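As a quick sanity check before pulling a container, one can compare the host driver version against the minimum driver that natively supports the container's CUDA release. A minimal sketch below; the minimum-driver values in the table are taken from NVIDIA's CUDA compatibility notes and should be treated as illustrative rather than authoritative:

```python
# Sketch: does a given host driver natively support a CUDA toolkit release?
# The minimums below are assumptions based on NVIDIA's Linux compatibility
# table (R520 for CUDA 11.8); verify against the official release notes.
MIN_NATIVE_DRIVER = {
    "11.8": (520, 61, 5),
    "12.0": (525, 60, 13),
}

def parse_version(version: str) -> tuple:
    """Turn a dotted driver string like '470.57.02' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def natively_supported(driver: str, cuda: str) -> bool:
    """True if the driver meets the native minimum for this CUDA release."""
    return parse_version(driver) >= MIN_NATIVE_DRIVER[cuda]

print(natively_supported("470.57.02", "11.8"))  # the poster's setup -> False
print(natively_supported("520.61.05", "11.8"))  # R520 driver -> True
```

When the check returns False on a GeForce card, the container falls back to forward compatibility and fails with error 804, which matches the behavior described above.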