Multi-GPU Power Supply Questions

Hi,

I am putting together a quiet system for my research (quiet = water cooled); mostly Monte Carlo simulations. I am nowhere near finished porting my code. I would like to utilise multiple GPUs. So far I have two water-cooled 9800GX2s on my workbench setup. I would like to put everything in the case and am wondering about additional cards. I have two open PCIe (x8) slots and a flex cable thingy, so I could add one or two more cards. I’ve searched for power specs on these cards but found nothing. A little experimenting gives:

One 9800GX2 at system idle consumes 90 watts air cooled or 67 watts water cooled (no fan load).

Running the SDK nbody program adds 63 watts at 258 GFLOP/sec; this uses only one of the two G92s. So a similarly active program using both G92s would consume 67 + 63 + 63 = 193 watts.

My power supply is rated 1250 watts continuous and 1375 watts peak.

My system with an old VGA card consumes 310 watts with a heavy computational load. So, other things being equal, I could run a similarly active program on 4 cards at 310 + 193 × 4 = 1082 watts. But other things are never equal!
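To keep my assumptions visible, here is that back-of-the-envelope budget as a throwaway snippet (the 67 and 63 watt figures are my watt-meter readings above; the rest is guesswork, not factory specs):

    // budget.cu -- rough average-power budget for 4 cards.
    // Measured averages only; says nothing about transient peaks.
    #include <cstdio>

    int main()
    {
        const float base_system  = 310.0f; // measured: system + old VGA card, heavy load
        const float card_idle    = 67.0f;  // measured: one water-cooled 9800GX2 at idle
        const float per_g92_load = 63.0f;  // measured: nbody on one of the two G92s
        const int   num_cards    = 4;

        // Assume each G92 under load adds what one G92 running nbody does.
        float per_card = card_idle + 2.0f * per_g92_load;    // 193 W
        float total    = base_system + num_cards * per_card; // 1082 W

        printf("per card: %.0f W, total: %.0f W (PSU: 1250 W continuous)\n",
               per_card, total);
        return 0;
    }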

First, I am only measuring average power. Transient peaks can be much higher.

Second, how close is the nbody program to “full load”? Does power scale linearly with GFLOPs?

I don’t want to do this assembly twice. I have to build a substructure to support the weight of the WC cards in a horizontal position, and if I buy too many cards, I can’t really return them after modifying them.

An alternative plan is to have the two 9800GX2s in the x16 slots and two 8800GTS (G92) cards in the x8 slots, to sort of balance the memory bandwidth load.
Another plan is 2 x 9800GX2 and 2 x GTX 280 cards (they seem to be dropping in price). Are there potential problems with this mix of cards?

Does anyone have power load data on these cards; either factory specs or personal experience?

Does anyone know of a free or open source program that would give a maximum or near maximum load on these cards?
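To be concrete, this is the sort of thing I have in mind: a minimal, untested CUDA sketch that just spins dependent multiply-adds so the shader ALUs stay busy while I watch the watt meter. It loads only the arithmetic units (no memory traffic), so I would treat whatever it draws as a floor rather than a true maximum:

    // burn.cu -- hypothetical max-load sketch, NOT a calibrated power virus.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void burn(float *out, int iters)
    {
        float a = threadIdx.x * 0.001f + 1.0f;
        float b = blockIdx.x * 0.002f + 1.0f;
        for (int i = 0; i < iters; ++i) {
            a = a * b + 0.999f; // dependent MADs, no memory traffic
            b = b * a + 1.001f;
        }
        // Write the result so the compiler can't optimise the loop away.
        out[blockIdx.x * blockDim.x + threadIdx.x] = a + b;
    }

    int main()
    {
        const int blocks = 1024, threads = 256;
        float *d_out;
        cudaMalloc((void **)&d_out, blocks * threads * sizeof(float));
        for (;;) { // run until killed; read the watt meter meanwhile
            burn<<<blocks, threads>>>(d_out, 1 << 20);
            cudaThreadSynchronize();
        }
        return 0;
    }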

I realize that maximum utilization of these cards is difficult to achieve, but I would rather be prepared for near maximum loads.

Any advice appreciated, Liz

All these cards are pretty much designed to max out their power connectors: two 6-pin plugs plus the PCIe slot are good for 225W total.

Re your power supply: it’s not a matter of simply adding up watts and seeing if they come in under 1250. Look in the PSU’s documentation for info regarding rails. You can only put so many amps through a single rail, and in practice that really cuts down on the total number of cards you can hook up. I don’t think any single PSU will support 4. I’d recommend buying an auxiliary PSU. (Search “GPU PSU” on newegg.)

I’m not sure how well your flex cable will work. It’s probably designed for 2.5GHz operation, and PCIe 2.0 uses 5GHz. In general, the PCIe bus is already at the limits of signal quality, and a flex cable sounds really dangerous. Make sure to test it thoroughly. Instead of doing the flex cable thing, I’d suggest just buying a motherboard with four PCIe x16 slots. There are a couple on newegg for $150-$200.

There is another active thread started by a person who wants to build a 4-GPU rig. Look into it; there’s a lot of advice there.

Thanks Alex,

  1. Do you happen to know the rating on the 8-pin PCIe power connector?

  2. I am looking into the PSU rails, good advice.

  3. I have a MB with 4 PCIe slots, but there is not enough space between slot 3 and slot 4. My board is only PCIe 1.0. I already tested the cable: no difference in performance. It is a ribbon cable about 4 inches long, shielded with foil tape, with appropriate connectors on each end.

  4. I’ve looked at the other thread, and also the Fastra website. I wanted to get a 1500 watt PSU but could only find one for 230 volts. I only have 120 volt, 20 amp outlets (unless I put it in the kitchen). 1500 watts at 110 volts, 80% efficiency, and 90% power factor is less than 19 amps (1500 / (0.8 × 0.9 × 110) ≈ 18.9 amps), but nobody seems to make them.

Liz

The rating of the connector itself doesn’t matter much for the purpose of hacking. (There are limits on when the wires start to heat up. For commercial designs each wire pair is rated for something like 25W, but that’s very conservative; it could easily conduct twice as much.) Just find out the ratings on the rail that feeds it.

If the cable doesn’t affect performance, I guess it means there aren’t a lot of packet retries on the bus, so the signal quality is OK. I dunno, I guess it’s alright, although it must still be a lot of work mounting the card externally.

P.S. If you’ve coded your algorithm to use multiple GPUs efficiently, the 9800GX2 should give the best performance versus the 8800 or even the GTX 280. Also, having identical cards will probably make your algorithm run faster (due to more efficient load balancing).

P.P.S. I’ve found that the cards are very quiet if they’re fed cool air. The loudest component in my system is the PSU. Are you sure water cooling is worth the hassle? And that you have enough water pressure?

I actually read the manual for my PSU (Cooler Master 1250): there are three 12 volt rails devoted to the PCIe cables. Each one is rated at 28 amps peak or 25.45 amps continuous. Each one has a combo 8 + 6 pin cable plus an additional 6 pin. If I distribute four cards over the three rails, the fourth card has to split its draw across two of them, so those two rails each feed 1.5 card-loads; the cards on them have a maximum of 28 / 1.5 = 18.67 amps peak or 17 amps continuous, which is 204 watts continuous. My estimate for a program comparable to nbody running on two GPU chips was 193 watts. So this PSU is definitely marginal for 4 x 9800GX2 with a heavy computational load. What is likely to happen with an overload? System shutdown, PSU damage, or GPU damage?
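To double-check that arithmetic, here is the rail bookkeeping as a snippet (it just encodes my wiring assumption that the fourth card splits evenly across two rails; nothing in it is measured):

    // rails.cu -- back-of-envelope check of the 4-cards-on-3-rails split.
    #include <cstdio>

    int main()
    {
        const float rail_cont = 25.45f; // per-rail continuous rating, amps
        const float rail_peak = 28.0f;  // per-rail peak rating, amps
        const float volts     = 12.0f;

        // A shared rail feeds one whole card plus half of card 4,
        // i.e. 1.5 card-loads, so each card on it is limited to:
        float amps_cont  = rail_cont / 1.5f;  // ~16.97 A
        float amps_peak  = rail_peak / 1.5f;  // ~18.67 A
        float watts_cont = amps_cont * volts; // ~204 W

        printf("per-card limit: %.2f A cont (%.2f A peak) = %.0f W\n",
               amps_cont, amps_peak, watts_cont);
        printf("vs. my 193 W two-G92 estimate: %.0f W of headroom\n",
               watts_cont - 193.0f);
        return 0;
    }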

Koolance makes a 1300 watt PSU (running on 120 volts) with 4 rails (20A, 20A, 40A, 40A) and eight PCIe connectors: four 6-pin and four 8-pin. I am trying to find out how they are distributed over the rails. REAL expensive at US$600, and water cooled.

RE: water cooling. I have two Laing DDC pumps with two separate water loops. Only one loop is running, with one CPU waterblock (DangerDen) and two GPU waterblocks (Koolance); flow is 2.1 gallons per minute. Under load, GPU temperature is 40 degrees C versus 73 degrees C air cooled. Installing the GPU waterblocks IS a hassle. Very tedious: the first one took four hours, the second one 3 hours.

Liz

As for your idea of mixing some 9800 and 8800s as an alternative…

Computationally, you’d be better off with 2 GTX 280s. One powerful GPU is a lot easier to deal with than 2 less powerful ones.
There are lots of advantages to the GTX 280, even if you don’t need them all, like double-precision support, PCIe 2.0 bandwidth, better coalescing, bigger RAM, etc.

Plus you can always add more GPUs later; if you start with 4 lower-end ones, you’re already maxed out.

PSU damage, then random shutdowns. GPU/everything else should be alright.

The last time I looked at water cooling, what scared me off was the life expectancy of the pumps.

If it’s a decent PSU, it will just trip a breaker and shut off without damaging anything.