NVIDIA M2090 power compatibility question

Hi all,

I’m new to the CUDA world. I bought a used NVIDIA Tesla M2090 on eBay and plan to install it in an HP xw8600 workstation ( http://www8.hp.com/h20195/v2/getpdf.aspx/c04136962.pdf?ver=18 ), which comes with an 800W power supply.

The M2090 seems to draw < 225W but I’m not sure if 800W is enough for that beast. Also from a cooling point of view, is the XW8600 a non-starter to begin with?

Thanks in advance.


Personally, I would say that from a cooling point of view, the XW8600 is a non-starter.

These passively cooled Tesla modules really need to be installed in a server that is designed to handle them.

Anything else is just asking for trouble. You can find plenty of descriptions of that trouble on this forum if you look.

I used to operate an xw8600 with a C2050 (actively cooled) and a low-end Quadro for graphics. In practical terms, the issue was that the power supply in the xw8600 didn’t even offer an 8-pin PCIe power connector, so I had to run it out of spec with a 6-pin to 8-pin adapter: 6-pin connectors are officially rated for only 75W, while 8-pin connectors are rated for 150W. That said, this worked fine for many years. Cooling was adequate to keep the C2050 going at full tilt, but the GPU did get very hot when running continuously at full load.

As txbob points out, any passively cooled GPU such as the M2090 is an absolute non-starter in that workstation enclosure; there is no way a passively cooled GPU designed to draw close to 300W at peak (slot: 75W + 6-pin: 75W + 8-pin: 150W) can be cooled adequately. The GPU will shut down pretty much immediately after tripping its thermal limit. Maybe you can exchange your M2090 for a C2070 or C2050 (actively cooled) somewhere.
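For reference, the peak-draw arithmetic above can be sketched as a quick check. The wattage figures are the standard PCIe per-source delivery limits mentioned in this thread; the card names are just illustrative:

```python
# Quick sanity check: peak power a PCIe GPU can draw from each source,
# using the standard PCIe power-delivery limits discussed above.
PCIE_SLOT_W = 75    # power delivered through the PCIe x16 slot itself
SIX_PIN_W = 75      # 6-pin PCIe auxiliary connector rating
EIGHT_PIN_W = 150   # 8-pin PCIe auxiliary connector rating

peak_draw = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(peak_draw)  # 300 -- i.e. close to 300W at peak, as noted above
```

This is only the electrical delivery ceiling; the board's actual TDP (225W for the M2090) is what the cooling has to dissipate.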

Thanks for the replies @txbob and @njuffa

I looked at the C2050s but they were a bit pricey (in the $400 range).

What would be a consumer class GPU card that would have compute power comparable to the M2090 or C2050?

Would the GeForce GTX 750Ti even get close?

or can you recommend any other GPU cards for the xw8600?

I want to use the GPU for neural network simulations… so my application is all academic research

Thanks again

For neural networks, I would at least want to have the capability to use the cuDNN library, not because I plan to use it directly, but because I might want to use a framework (Caffe, Torch, Theano) that uses it.

cuDNN requires a GPU with compute capability 3.0 or higher. Those Fermi-class GPUs (205x/207x/2090) are cc 2.0, so they won’t work with that library.
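A minimal sketch of that compatibility rule, using a small hard-coded table of compute capabilities for the cards mentioned in this thread (on a real system you would query the device with deviceQuery or cudaGetDeviceProperties instead):

```python
# Hypothetical helper: check whether a GPU's compute capability meets
# cuDNN's minimum of cc 3.0. The table lists the cards discussed in
# this thread (Fermi parts are cc 2.0, Maxwell parts are cc 5.x).
CUDNN_MIN_CC = (3, 0)

COMPUTE_CAPABILITY = {
    "Tesla M2090": (2, 0),   # Fermi
    "Tesla C2050": (2, 0),   # Fermi
    "GTX 750 Ti": (5, 0),    # Maxwell
    "GTX 960": (5, 2),       # Maxwell
    "GTX 970": (5, 2),       # Maxwell
}

def supports_cudnn(gpu: str) -> bool:
    """True if the GPU's (major, minor) compute capability is >= cc 3.0."""
    return COMPUTE_CAPABILITY[gpu] >= CUDNN_MIN_CC

print(supports_cudnn("Tesla M2090"))  # False
print(supports_cudnn("GTX 970"))      # True
```

Tuple comparison handles the major/minor ordering for free, which is why the capabilities are stored as `(major, minor)` pairs rather than floats.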

A GTX 750 Ti would work. Beyond that, for neural networks I would want to maximize memory bandwidth and single-precision compute throughput within my budget. Since your budget sounds like it is under $400, I would suggest a GTX 970 (~$350), or if that is too much, a GTX 960 (~$200).

I can’t speak to the fit in an xw8600. I just recently tried to put a GTX 960 in an old T3500 workstation and ran into mechanical interference issues with some of the internal sheet metal.

Excellent point about checking dimensions. As I recall, the C2050, with a length of 9.75", barely fit into the xw8600 on account of the drive bays. I see that the length of the GTX 970 is 10.5" [url]http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-970/specifications[/url]. So measure twice before ordering up a GTX 970.