Usage of GeForce GPU for ML

Can the GPU in consumer graphics cards (GeForce and the like) in a PC be used to run machine learning algorithms? If yes, how do you do that? My assumption is that the IDE (such as Spyder) used to develop machine learning code runs by default only on the CPU. However, this is slow for obvious reasons, and my interest is to move this execution to the GPU. Please note that the idea here is to leverage the graphics card that comes with the PC, not to buy another expensive accelerator card.


Yes, GeForce GPUs can absolutely be used for DL/ML development. Depending on the GPU's memory size and type, you may want to consider NGC Docker containers as your development environment; they eliminate many of the dependency and installation issues associated with DL/ML frameworks and libraries.

In short, yes, you can leverage NVIDIA GTX/RTX for compute!
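As a concrete sketch (assuming PyTorch as the framework, which supports CUDA on GeForce cards), the usual pattern is to detect the GPU at runtime and place your tensors and models on it; the same code falls back to the CPU when no CUDA device is present:

```python
import torch

# Use the GPU if CUDA (NVIDIA's compute stack) is available, else fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Tensors created on (or moved to) `device` run their ops on that device.
x = torch.randn(1024, 1024, device=device)
y = x @ x  # this matmul executes on the GeForce GPU when device is "cuda"

print(device.type, tuple(y.shape))
```

The same `device` object is what you pass to `model.to(device)` and `batch.to(device)` in a training loop, so moving an existing CPU-only script to the GPU is usually just a matter of threading this one variable through your code.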