Can the RTX 3060 support CUDA with PyTorch?

Hi,
I want to buy an RTX 3060 for my computer. I read the CUDA-enabled GPUs document on the NVIDIA Developer site, but it seems that the RTX 3060 is only listed for mobile. I want to use it on a desktop. Is the document not up to date, or is the RTX 3060 really not supported on desktop?
Have a nice day.

This is a relevant question that I didn't think I needed to check before buying a GeForce RTX 3060 :').

I researched a lot (after getting the new machine, of course) on how to use PyTorch with an RTX 3060, especially with older versions of torch (0.4.0) and torchvision (0.2.1), but had no luck. The RTX 3060 and those packages apparently don't share a compatible combination of CUDA and cuDNN versions. I tried different combinations of the compiled builds available in conda, but none of them worked; it might work if you recompile these packages from source.

After all this, I was actually able to use the RTX 3060 with the latest versions of all these dependencies, via two methods:

  1. Using a conda env and the latest versions published on the PyTorch site (Start Locally | PyTorch):
conda create -n rtx_3060 python=3.6.5
conda activate rtx_3060
conda install pytorch torchvision torchaudio cudatoolkit=11.1 -c pytorch -c nvidia
  2. Using LambdaLabs (Install TensorFlow & PyTorch for the RTX 3090, 3080, 3070):
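# Add the Lambda repository and install Lambda Stack with CUDA: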
LAMBDA_REPO=$(mktemp) && \
wget -O${LAMBDA_REPO} https://lambdalabs.com/static/misc/lambda-stack-repo.deb && \
sudo dpkg -i ${LAMBDA_REPO} && rm -f ${LAMBDA_REPO} && \
sudo apt-get update && sudo apt-get install -y lambda-stack-cuda

Reboot your machine.

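# Alternative from the same guide for a headless server: same repo setup, then lambda-server and the headless NVIDIA driver: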
LAMBDA_REPO=$(mktemp) && \
wget -O${LAMBDA_REPO} https://lambdalabs.com/static/misc/lambda-stack-repo.deb && \
sudo dpkg -i ${LAMBDA_REPO} && rm -f ${LAMBDA_REPO} && \
sudo apt-get update && \
sudo apt-get --yes upgrade && \
sudo apt-get install --yes --no-install-recommends lambda-server && \
sudo apt-get install --yes --no-install-recommends nvidia-headless-455 && \
sudo apt-get install --yes --no-install-recommends lambda-stack-cuda

Reboot your machine.

*** The good thing about this method is that if your current environment is messed up, Lambda Stack will actually fix it.
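
Whichever method you use, a quick sanity check inside the new environment confirms that PyTorch can actually see the card (a minimal sketch using standard PyTorch calls):

import torch

print(torch.cuda.is_available())      # should print True
print(torch.cuda.get_device_name(0))  # should report the GeForce RTX 3060

If is_available() prints False, the installed build and the driver most likely don't match.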

Cheers.

Update: I created a Gist based on this post: PyTorch on RTX 3060 · GitHub

Thank you for your help. I also purchased an RTX 3060 and installed the latest versions of torch and CUDA, and it worked without any problems.

Hi.

Why does conda install cudatoolkit=11.4 not work?

Hi.
The cudatoolkit you download from NVIDIA and the PyTorch or TensorFlow version you install must be compatible with each other. I use CUDA with the 3060 for image processing, and its performance has satisfied me.
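
If it helps, here is a minimal sketch (standard PyTorch calls, assuming the card is device 0) that shows whether the installed build was compiled for this card's architecture:

import torch

# Ampere cards such as the RTX 3060 have compute capability 8.6 (sm_86),
# which is why a CUDA 11.x build of PyTorch is needed.
print(torch.cuda.get_device_capability(0))  # expected: (8, 6)
print(torch.cuda.get_arch_list())           # should include 'sm_86'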

Will an installation of PyTorch built for CUDA 11.1 work with 11.4?

I don't know, but I use 11.3 with PyTorch 1.8, and it works.
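
For reference, something like this prints the combination in use (a minimal sketch; the conda/pip binaries bundle their own CUDA runtime, so torch.version.cuda reports the version PyTorch was built against, not the system toolkit):

import torch

print(torch.__version__)               # e.g. 1.8.x
print(torch.version.cuda)              # e.g. '11.1' or '11.3'
print(torch.backends.cudnn.version())  # bundled cuDNN build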

What is the benefit of using LambdaLabs?

The good thing about this method is that if your current environment is messed up, Lambda Stack will actually fix it.

Would this work with CUDA 10.2 and PyTorch 1.7.0?

Hello, this forum is dedicated to discussions related to using the sanitizer tools and API.
Questions related to CUDA can be raised at CUDA - NVIDIA Developer Forums