CUDA-Enabled GeForce 1650?

Hi all,

I would like to know if GeForce 1650 is CUDA-Enabled.

It doesn’t appear in this list ([url]https://developer.nvidia.com/cuda-gpus[/url]), but could someone confirm this for me?

Thank you very much.

Every GPU produced by NVIDIA since about 2008 is CUDA enabled.

While trying to run GROMACS on my laptop, the software states that my GTX 1650 is not enabled for computing. nvidia-smi shows the graphics card, and the drivers are installed.

Install CUDA. When you install CUDA, select the option to keep your current driver version. (*)

CUDA installers can be obtained here: [url]http://www.nvidia.com/getcuda[/url]

Follow the install guide instructions (linked from that page) for your OS carefully.

If GROMACS has specific CUDA version requirements, you will need to check with the GROMACS community or rebuild your GROMACS to match your installed CUDA version.

(*) (Note for future readers: this doesn’t necessarily apply to you. Right at the moment, GTX 1650 is a very new GPU, and so any driver that works with GTX 1650 will work with any currently available CUDA toolkit version. This doesn’t apply to every GPU and every CUDA version, and may no longer be valid months or years into the future.)

If you cannot find the answer in the GROMACS documentation, I would suggest asking about GROMACS configuration issues on the official GROMACS mailing list:

[url]http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List[/url]

Hi, what is the compute capability of the GTX 1650? I don’t see it here: https://developer.nvidia.com/cuda-gpus#compute

All Turing-family GPUs are currently compute capability 7.5.

https://www.techpowerup.com/gpu-specs/geforce-gtx-1650.c3366

Hi, the compute capability of the GTX 1650 is not mentioned. I am an AI researcher, and I want to know whether it is possible to use this GPU card for training neural networks.

The compute capability is 7.5. It is possible to use it for training neural networks. I won’t be providing tutorials for that here. The topic is covered elsewhere on these forums and other NVIDIA websites, as well as elsewhere on the web.

The GTX 1650 is based on the Turing architecture (specifically, the TU117 GPU) with compute capability 7.5. The minimum requirement for various deep-learning frameworks at this time is typically compute capability 3.5. So yes, you should be able to run those frameworks.

You would want to carefully check the hardware requirements of the particular deep-learning frameworks and applications you intend to run.
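As a rough illustration of the check described above: a compute capability is just a (major, minor) version pair, so a device's capability can be compared against a framework's minimum directly. The helper name below is made up for illustration and is not part of any framework's API.

```python
# Hypothetical helper: compare a GPU's compute capability against a
# framework's minimum requirement, both given as (major, minor) tuples.
def meets_minimum(device_cc, required_cc):
    """Return True if device_cc satisfies required_cc."""
    return tuple(device_cc) >= tuple(required_cc)

# GTX 1650 (Turing) is compute capability 7.5; a typical deep-learning
# framework minimum at the time of this thread was 3.5.
gtx_1650 = (7, 5)
print(meets_minimum(gtx_1650, (3, 5)))  # True
```

Tuple comparison handles the minor version correctly, e.g. (3, 0) fails a (3, 5) requirement even though the major versions match.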

Hi, the compute capability of the GTX 1650 Ti is not mentioned anywhere. Can this card be used for training neural networks?

Thank you.

I assume this is actually a GTX 1650Ti Mobile? If so, that is also based on the Turing architecture (specifically, a TU11x chip) with compute capability 7.5, just like the GTX 1650 discussed above.

Again, you would want to carefully check the hardware requirements of the particular deep-learning frameworks and applications you intend to run.

So, given that the GTX 1650 specs say it has CUDA cores, why is it not on the list? CUDA GPUs - Compute Capability | NVIDIA Developer

Hi! I recently purchased a laptop with an NVIDIA GeForce GTX 1650 Ti GPU. Please suggest the CUDA and cuDNN versions that are compatible with this GPU, so that I can work with deep-learning libraries like TensorFlow.

I assume this is a GeForce GTX 1650 Ti Mobile, which is based on the Turing architecture, with compute capability 7.5 (sm_75). Any CUDA version from 10.0 to the most recent one (11.2.2) will work with this GPU. However, various components of the software stack used in deep learning may support only very specific versions of CUDA. Check the documentation of those software components for their respective requirements.
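The version rule stated above (Turing/sm_75 requires CUDA 10.0 or newer) can be sketched as a one-line check; the function name is hypothetical, invented for this example:

```python
# Turing GPUs (compute capability 7.5, sm_75) are supported starting
# with CUDA 10.0, so any toolkit from 10.0 onward can target them.
def cuda_supports_sm75(cuda_version):
    """cuda_version is a (major, minor) tuple, e.g. (11, 2)."""
    return cuda_version >= (10, 0)

print(cuda_supports_sm75((9, 2)))   # False: toolkit predates Turing support
print(cuda_supports_sm75((11, 2)))  # True
```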

As best I can tell from the TensorFlow GPU requirements page, TensorFlow >= 2.4.0 requires CUDA 11 and cuDNN 8.0.4. I do not use TensorFlow, so I cannot confirm this reading of the TensorFlow docs. You may want to seek an authoritative answer in the TensorFlow support channels.
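To make that pairing concrete, here is an illustrative lookup table. The only entry is the one quoted in this thread from the TensorFlow GPU requirements page; verify it (and any other versions) against the official TensorFlow tested-configurations table before relying on it.

```python
# Illustrative TensorFlow -> (CUDA, cuDNN) pairing, as quoted above.
# Verify against the official TensorFlow GPU support documentation.
TF_GPU_REQUIREMENTS = {
    "2.4.0": ("11", "8.0.4"),  # TensorFlow version: (CUDA, cuDNN)
}

def required_stack(tf_version):
    """Return the (CUDA, cuDNN) pair for a TensorFlow version, if known."""
    return TF_GPU_REQUIREMENTS.get(tf_version)

print(required_stack("2.4.0"))  # ('11', '8.0.4')
```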

Hi,

I have an NVIDIA GeForce GTX 1650 Ti GPU, driver version 461.09, and I am working in Windows 10 x64. Is my GPU CUDA-enabled?

Further, I would like to run a Python script in parallel for data feature extraction, but I have found confusing information about it. Can you help me with this? Is there any documentation that may help me?

Best,

Same question

If the question is “Is the NVIDIA GeForce GTX 1650 Ti GPU CUDA enabled?”, the answer is: yes. Every GPU NVIDIA has introduced for the past dozen years or so is CUDA enabled. If your question is about something else, please state the question.

The original question is whether the NVIDIA GeForce GTX 1650 is CUDA compatible, not the GTX 1650 Ti version. You state that every NVIDIA graphics card released in the last dozen years is CUDA enabled, but the GTX 1650 does not appear in the official list of CUDA-compatible GPUs despite the fact that it was released in 2019. My question is: will the GTX 1650 be CUDA compatible in the near future, or will it not be supported at all? Thank you in advance.

@jordi.pomada The original question about the GTX 1650 was answered by Robert Crovella in September 2019 (see start of the thread).

I, on the other hand, was responding to a question from November 2021 that did not specify the graphics card, but which directly followed a question from May 2021 about the GTX 1650 Ti. When people refer to the “same”, it is reasonable to assume that they are referring to the closest applicable context provided by the thread (which in this case was the GTX 1650 Ti).

Manually maintained lists can contain bugs. One class of bugs is omissions.