Both the GTX 1050 Ti and the GTX 1650 support CUDA, and either is new enough to be supported by TensorFlow. The 1050 Ti has compute capability (CC) 6.1 and the 1650 has CC 7.5; TensorFlow currently requires CC 3.5 or higher. If you are planning to run training (rather than just inference), you will also want to make sure the frame buffer (GPU memory) is large enough to hold your models of interest.
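
Once a GPU-enabled TensorFlow build is installed, a quick sanity check is to confirm TensorFlow actually sees the card and to read back its compute capability. This is a minimal sketch, assuming TensorFlow 2.4 or newer (where `tf.config.experimental.get_device_details` is available):

```python
import tensorflow as tf

# List the GPUs visible to TensorFlow; an empty list means no usable CUDA device.
gpus = tf.config.list_physical_devices("GPU")
if not gpus:
    print("No GPU detected by TensorFlow")

for gpu in gpus:
    # get_device_details reports the device name and the compute capability
    # as a (major, minor) tuple, e.g. (6, 1) for a 1050 Ti or (7, 5) for a 1650.
    details = tf.config.experimental.get_device_details(gpu)
    print(gpu.name, details.get("device_name"), details.get("compute_capability"))
```

If the card shows up here with a CC at or above TensorFlow's minimum, the remaining concern for training is simply whether its memory is large enough for your models.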