CUDA for RTX 3060

Hi Everyone,

I just bought a new notebook with an RTX 3060. I have always used Colab and Kaggle, but now I would like to train and run my models on my own notebook without those limitations.

I just looked at CUDA GPUs | NVIDIA Developer and it seems that my RTX is not listed as supported by CUDA, but I also found this topic, CUDA Out of Memory on RTX 3060 with TF/Pytorch, so it seems that some people are working with CUDA on this kind of GPU.

I installed CUDA 11.3.58, but none of my models run. How can I fix or set up my CUDA installation?

Thanks in advance
Luigi

Hi @maieseluigi, first of all, please describe “none of my models run” in more detail (error messages, stack traces, and so on); that will help narrow down the issue.

Anyway, here is my summary and a suggestion for you:

Summary
I am currently using a Ryzen 5 2600 with an RTX 3060 12 GB. My previous issue was due to a lack of DDR4 RAM.

The solution in my case was to make sure that system RAM is at least twice as large as the GPU’s VRAM. In short, you may just need to buy more RAM.
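That rule of thumb can be sketched as a quick check (the 2× factor is this thread’s heuristic, not an official requirement, and the function name is illustrative):

```python
def ram_headroom_ok(system_ram_gb: float, gpu_vram_gb: float) -> bool:
    """Rule of thumb from this thread: system RAM should be at least
    twice the GPU's VRAM, since training data is often staged in host
    memory before being copied to the device."""
    return system_ram_gb >= 2 * gpu_vram_gb

# A 12 GB RTX 3060 would then want at least 24 GB of system RAM:
print(ram_headroom_ok(16, 12))  # False - 16 GB is below the 24 GB threshold
print(ram_headroom_ok(32, 12))  # True
```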

FYI, I use CUDA 11.2 and PyTorch 1.8.

Suggestion:
I suggest that you first double-check that your GPU is recognized by your OS. On Windows, go to Device Manager → Display adapters.

If it is, you can then build and run the CUDA samples (e.g. deviceQuery) to check that your GPU is recognized by the CUDA toolkit and works normally.
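Before building the samples, a quick script like this (a sketch using only the Python standard library; the function name is illustrative) reports whether the driver and toolkit binaries are even visible. It does not replace running the actual CUDA samples:

```python
import shutil

def cuda_tooling_status():
    """Check whether the NVIDIA driver CLI (nvidia-smi) and the CUDA
    compiler (nvcc) are on PATH; both should normally be present after
    a working driver + CUDA toolkit install."""
    return {tool: shutil.which(tool) or "not found on PATH"
            for tool in ("nvidia-smi", "nvcc")}

if __name__ == "__main__":
    for tool, location in cuda_tooling_status().items():
        print(f"{tool}: {location}")
```

If both are found, running the bundled deviceQuery sample should additionally confirm the GPU’s compute capability (8.6 for the RTX 3060).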

If that also works, then it might be your PyTorch version, a dependency issue, or a potential bug in PyTorch; at that point you may need to open a new thread on the PyTorch discussion forum.
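For that last step, a small diagnostic like the following (a sketch; it assumes nothing beyond an optional PyTorch install and degrades gracefully without it) gathers the details a PyTorch forum thread would ask for:

```python
def pytorch_cuda_report():
    """Collect PyTorch/CUDA diagnostics as a list of lines; reports a
    hint instead of failing if PyTorch is not installed."""
    try:
        import torch
    except ImportError:
        return ["PyTorch is not installed in this environment"]
    lines = [
        f"torch version: {torch.__version__}",
        f"built against CUDA: {torch.version.cuda}",   # None for CPU-only builds
        f"cuda available: {torch.cuda.is_available()}",
    ]
    if torch.cuda.is_available():
        lines.append(f"device 0: {torch.cuda.get_device_name(0)}")
    return lines

if __name__ == "__main__":
    print("\n".join(pytorch_cuda_report()))
```

A common pitfall this surfaces is an accidentally installed CPU-only PyTorch wheel: `torch.version.cuda` is then `None` even though the system’s CUDA install is fine.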

Hi @maieseluigi,

This doesn’t look like a TensorRT-related issue. We recommend posting your concern on a related platform.

Thank you.