Hello,
I have followed all the instructions here: https://docs.nvidia.com/cuda/wsl-user-guide/index.html#unique_1238660826 to set up CUDA in WSL2.
However, the GPU still isn’t detected when running from WSL2. In the Windows command line, nvidia-smi works fine, and both torch.cuda.is_available() and tf.test.is_gpu_available() return True. Inside WSL2 they both return False (if I read it right, nvidia-smi isn’t supposed to work there anyway).
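For reference, this is roughly the check I run in both environments (a minimal sketch assuming standard PyTorch and TensorFlow installs; the exact script isn’t important, only that everything returns True on Windows and False inside WSL2):

# quick GPU visibility check, run identically on Windows and inside WSL2
import torch
import tensorflow as tf

# PyTorch: True on the Windows side, False inside WSL2 for me
print("torch.cuda.is_available():", torch.cuda.is_available())

# TensorFlow: same behaviour; list_physical_devices is the newer API
# and reports the same thing as tf.test.is_gpu_available()
print("tf.test.is_gpu_available():", tf.test.is_gpu_available())
print("GPUs seen by TF:", tf.config.list_physical_devices("GPU"))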
I have also tried the setup in a Docker container, but there I get exactly the container runtime initialization error mentioned at the bottom of the instructions (“Error response from daemon: OCI runtime create failed: …”).
I have already spent many hours uninstalling, reinstalling, and restarting everything, but it doesn’t seem to help. Is there anything I’m missing?
Happy for any hints!
Here is the DxDiag file:
DxDiag.txt (106.8 KB)
Docker version 19.03.8, build afacb8b7f0
wsl -l -v:
  NAME                   STATE           VERSION
- Ubuntu                 Running         2
  docker-desktop-data    Running         2
  docker-desktop         Running         2
Windows OS Version: 10.0.19042 N/A Build 19042