Torch is not able to use CUDA/No CUDA GPUs are available

I’m using a GTX 1660 Super, Windows 10

So I’m trying to use a webui and I’m running into an issue with PyTorch and CUDA, where it outputs:

"C:\Users\Austin\stable-diffusion-webui\venv\Scripts\python.exe" -c "import torch; assert torch.cuda.is_available(), 'Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check'"
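
If it helps with diagnosis, this is the kind of minimal check I can run from the venv’s python.exe to see what torch itself reports (nothing webui-specific, just the standard torch.cuda calls):

import torch

print("torch version:", torch.__version__)          # e.g. a +cuXXX suffix for a CUDA build, none for CPU-only
print("built with CUDA:", torch.version.cuda)       # None would mean a CPU-only build of torch
print("CUDA available:", torch.cuda.is_available()) # this is the check that fails for me
print("device count:", torch.cuda.device_count())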

Adding --skip-torch-cuda-test to COMMANDLINE_ARGS does let it run, but then it’s of course forcing everything onto the CPU, which defeats the point of CUDA. And when the app outputs the error above, I also get this in the Event Viewer:

Faulting application name: python.exe, version: 3.10.9150.1013, time stamp: 0x638fa05d
Faulting module name: nvcuda64.dll, version: 31.0.15.3129, time stamp: 0x640826be
Exception code: 0xc0000409
Fault offset: 0x000000000053c834
Faulting process id: 0x1448
Faulting application start time: 0x01d9592d82bf1bc7
Faulting application path: D:\Programs\ComfyUI\python_embeded\python.exe
Faulting module path: C:\WINDOWS\system32\DriverStore\FileRepository\nv_dispi.inf_amd64_059948e396d205d5\nvcuda64.dll
Report Id: 1fec2025-d92a-4707-80bb-25b17677dfd5
Faulting package full name:
Faulting package-relative application ID:

Which seems to point to nvcuda64.dll, version 31.0.15.3129 being the problem. As an attempted fix I deleted that version of nvcuda64.dll and downgraded my graphics drivers to an older version, so the nvcuda64.dll in system32 is now version 31.0.15.2756, and the webui actually opens. But upon trying to run the program it outputs all of this, saying no CUDA GPUs are available:

D:\Programs\ComfyUI\python_embeded\lib\site-packages\safetensors\torch.py:99: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
with safe_open(filename, framework="pt", device=device) as f:
D:\Programs\ComfyUI\python_embeded\lib\site-packages\torch\_utils.py:776: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
return self.fget.__get__(instance, owner)()
D:\Programs\ComfyUI\python_embeded\lib\site-packages\torch\storage.py:899: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
storage = cls(wrap_storage=untyped_storage)
Traceback (most recent call last):
File "D:\Programs\ComfyUI\ComfyUI\execution.py", line 174, in execute
executed += recursive_execute(self.server, prompt, self.outputs, x, extra_data)
File "D:\Programs\ComfyUI\ComfyUI\execution.py", line 54, in recursive_execute
executed += recursive_execute(server, prompt, outputs, input_unique_id, extra_data)
File "D:\Programs\ComfyUI\ComfyUI\execution.py", line 54, in recursive_execute
executed += recursive_execute(server, prompt, outputs, input_unique_id, extra_data)
File "D:\Programs\ComfyUI\ComfyUI\execution.py", line 54, in recursive_execute
executed += recursive_execute(server, prompt, outputs, input_unique_id, extra_data)
[Previous line repeated 1 more time]
File "D:\Programs\ComfyUI\ComfyUI\execution.py", line 63, in recursive_execute
outputs[unique_id] = getattr(obj, obj.FUNCTION)(**input_data_all)
File "D:\Programs\ComfyUI\ComfyUI\nodes.py", line 244, in load_checkpoint
out = comfy.sd.load_checkpoint_guess_config(ckpt_path, output_vae=True, output_clip=True, embedding_directory=CheckpointLoader.embedding_directory)
File "D:\Programs\ComfyUI\ComfyUI\comfy\sd.py", line 776, in load_checkpoint_guess_config
fp16 = model_management.should_use_fp16()
File "D:\Programs\ComfyUI\ComfyUI\comfy\model_management.py", line 226, in should_use_fp16
if torch.cuda.is_bf16_supported():
File "D:\Programs\ComfyUI\python_embeded\lib\site-packages\torch\cuda\__init__.py", line 122, in is_bf16_supported
return torch.cuda.get_device_properties(torch.cuda.current_device()).major >= 8 and cuda_maj_decide
File "D:\Programs\ComfyUI\python_embeded\lib\site-packages\torch\cuda\__init__.py", line 674, in current_device
_lazy_init()
File "D:\Programs\ComfyUI\python_embeded\lib\site-packages\torch\cuda\__init__.py", line 247, in _lazy_init
torch._C._cuda_init()
RuntimeError: No CUDA GPUs are available
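
From what I can tell from the traceback, ComfyUI’s should_use_fp16() calls torch.cuda.is_bf16_supported(), which triggers torch’s lazy CUDA initialization, and that’s where it dies. If I’m reading it right, the same failure should be reproducible outside ComfyUI with something like this (my guess at a minimal repro):

import torch

# torch.cuda.init() forces the same lazy CUDA initialization the traceback ends in
torch.cuda.init()  # on my machine I'd expect this to raise RuntimeError: No CUDA GPUs are available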

I’m still pretty new to all this and have tried the few things I found on Google, but I’m just completely lost at this point.
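
One thing I haven’t tried yet is force-reinstalling a CUDA build of PyTorch into ComfyUI’s embedded Python with the standard command from the PyTorch site, something like the line below, but I’m not sure whether that’s the right fix or which CUDA version I should pick for these drivers:

D:\Programs\ComfyUI\python_embeded\python.exe -m pip install --force-reinstall torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118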