I have to decide whether to purchase a Tesla K40 or a Tesla M40 for large-scale machine learning in general and deep learning in particular. I’m leaning towards the M40 for its greater FP32 throughput, but the M40 is passively cooled, while the K40 is actively cooled. If I do purchase the M40, I’d like some suggestions for cooling it, especially since it will be installed in my workstation at home rather than in a 1U server.
My workstation specs:
NZXT Phantom tower case
Gigabyte Z77X-UD5H motherboard
GTX 780 GPU connected to the display
Intel i7-3770 CPU @ 3.5 GHz w/ Cooler Master Hyper 212 Evo cooler
Cooler Master V1000 1000 W, Gold-rated power supply
Thanks for any suggestions or feedback. If you think the K40 might be better than the M40, please feel free to speak up.
Using passively-cooled GPUs outside of server enclosures is problematic; we have had more than a handful of requests for assistance in these forums from people who attempted such configurations and failed (the GPU overheated and shut down to protect itself). Have you looked into using a Titan X? What is the expected usage pattern of the GPU?
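If you do end up trying a passively cooled card in a desktop, it is worth logging the GPU temperature under a sustained, representative load before trusting it with long training runs. Here is a minimal sketch using the NVML API (nvml.h, linked with -lnvidia-ml); the device index 0, the sample count, and the 5-second polling interval are just assumptions to adjust for your setup:

```cpp
// Minimal NVML temperature logger for GPU 0.
// Build (paths may vary): g++ temp_log.cpp -o temp_log -lnvidia-ml
#include <cstdio>
#include <unistd.h>
#include <nvml.h>

int main()
{
    if (nvmlInit() != NVML_SUCCESS) {
        std::fprintf(stderr, "nvmlInit failed\n");
        return 1;
    }

    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex(0, &dev) != NVML_SUCCESS) {
        std::fprintf(stderr, "could not get a handle for GPU 0\n");
        nvmlShutdown();
        return 1;
    }

    // Take 120 samples at 5-second intervals (~10 minutes) while a
    // representative workload is running, then check the peak reading.
    for (int i = 0; i < 120; ++i) {
        unsigned int tempC = 0;
        if (nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &tempC) == NVML_SUCCESS)
            std::printf("sample %3d: %u C\n", i, tempC);
        sleep(5);
    }

    nvmlShutdown();
    return 0;
}
```

If writing code for this is overkill, `nvidia-smi -q -d TEMPERATURE` reports the same sensor from the command line.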
Unless you know that the only thing you care about is FP32 codes, I personally would favor the K40c as being more general purpose, but that is a minor distinction compared to the trouble you will have with an M40 installed in an improper platform.
Thank you, everyone, for your help. The primary use for the card will be deep learning, and large-scale machine learning and optimization in general. I think I will go with a K40c rather than a Titan X. The Titan X has better FP32 throughput, but the K40c has ECC memory and better FP64 performance if I ever need that.
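Once the card is installed, a quick device-properties query should confirm that ECC is actually enabled and report the compute capability (3.5 on a K40c). This is a minimal sketch against the CUDA runtime API; the output formatting and GiB conversion are just for illustration:

```cpp
// Minimal device-properties check: name, compute capability, ECC state.
// Build: nvcc device_check.cu -o device_check
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::fprintf(stderr, "no CUDA devices found\n");
        return 1;
    }

    for (int dev = 0; dev < count; ++dev) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);
        std::printf("Device %d: %s\n", dev, prop.name);
        std::printf("  Compute capability : %d.%d\n", prop.major, prop.minor);
        std::printf("  Global memory      : %.1f GiB\n",
                    prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
        std::printf("  ECC enabled        : %s\n", prop.ECCEnabled ? "yes" : "no");
    }
    return 0;
}
```

If ECC shows up as disabled, it can be toggled with `nvidia-smi -e 1` (requires a reboot or GPU reset to take effect).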