I found that even if I don’t download or install CUDA, cuDNN, etc., when I open a WSL 2 (e.g. Ubuntu 22.04 LTS) terminal in Windows Terminal, the command
nvidia-smi still produces normal output, see:
So I’m wondering if I can simply treat it as a normal Ubuntu 22.04 LTS system. After installing CUDA, cuDNN, etc. on it, can I then use the GPU directly for my project as usual? I’m not quite sure whether this is feasible…
If everything is up to date, you should be able to use CUDA. See this MS docs page: Enable NVIDIA CUDA on WSL 2 | Microsoft Docs
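To sanity-check this inside the WSL 2 shell, a minimal sequence might look like the following. This is a sketch, not the full install procedure from the MS docs page; it assumes you install the CUDA toolkit per those docs and that `nvcc` ends up on your `PATH`:

```shell
# Inside the WSL 2 Ubuntu shell. The Windows-side NVIDIA driver already
# exposes the GPU to WSL, so this should work before installing anything:
nvidia-smi

# After installing the CUDA toolkit (following the MS docs above),
# the CUDA compiler should report its version:
nvcc --version
```

Note that you install only the CUDA toolkit inside WSL; the GPU driver itself stays on the Windows side.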
However, if you want to use ML libraries you will be restricted. I’m not sure about using cuDNN directly, but for libraries like TensorFlow and ONNX you will be limited in which versions support GPU acceleration.
See MS docs: GPU acceleration in WSL | Microsoft Docs
For example, if you look at PyTorch DirectML (DirectML is what lets WSL use the GPU for inference/training), it is limited to Python 3.8 (with all the limitations that come with that): DirectML/PyTorch at master · microsoft/DirectML · GitHub
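For reference, setting that up would look roughly like this. This is a sketch based on the DirectML repo linked above; check its README first, since the package name (`pytorch-directml` here) and supported versions may have changed:

```shell
# Create an environment pinned to Python 3.8, the version the
# PyTorch DirectML package supports (assumes conda is installed):
conda create -n directml python=3.8
conda activate directml

# Package name as given in the DirectML repo -- may have changed since:
pip install pytorch-directml
```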