8 GPUs in a 4U Server – What about heat?


Does anybody have experience with this (or a similar) configuration:


I wonder whether this allows for safe working conditions considering the limited airflow to the fans of the GPUs.

Best regards

Even though there are 8 GPUs in that computer, they are Tesla C1060s. They will generate only 50% more heat than a quad GTX 295 configuration. It’s hard to tell if the three 120mm fans that far from the backs of the cards will force sufficient air through the system, though.

I would ask the manufacturer if they have temperature measurements from the on-card sensors when this system is under full load. I would be curious what the temperatures of the middle two cards are. :)

We have a bunch of servers with 6 C2050 cards in each of them. Well, it gets quite hot under full load, but none of them has gone down due to overheating. My guess is that NVIDIA designed the cards to work at extremely high temperatures.

However, it may be a problem when you have hundreds of these beasts, as in our case. Our UPS did go out several times when we tried to run the 2,000 C2050s at full load.

Can you spare one of those for us? ;)

Whoa! Are you allowed to tell us what system has 2,000 C2050s in it, and what they are being used for??

Ah, THIS is where the shortage of Fermis comes from. ;)

500 kW of power (assuming 250 Watts each), 4 million Euros in costs for GPUs alone (assuming retail price for the C2050). Eww…
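The back-of-the-envelope figures above can be reproduced in a couple of lines; the 250 W per card comes from the post, while the ~2,000 EUR retail price per C2050 is an assumption needed to arrive at the 4 million Euro total:

```python
# Rough power and cost estimate for the 2,000-card C2050 machine.
num_gpus = 2000
watts_per_gpu = 250          # per-card TDP figure from the post
price_eur_per_gpu = 2000     # assumed retail price per C2050

total_power_kw = num_gpus * watts_per_gpu / 1000
total_cost_meur = num_gpus * price_eur_per_gpu / 1_000_000

print(total_power_kw)   # 500.0 (kW)
print(total_cost_meur)  # 4.0 (million EUR)
```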

I don’t think it is confidential. They held a power-up ceremony several days ago, and it was reported on TV. The machine is located in a research center. I think it will be used in research on new materials, but I am not sure. I am here only to test the performance of the machine.

In my opinion, putting so many Fermi cards in one box is not a good idea. The PCIe bus is easily saturated when all of the cards work at full load. Maybe two Fermis per box is a better choice.
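A quick sanity check on the saturation claim (the per-card effective bandwidth here is an assumed typical figure, not a measurement from this system): a C2050 sits on a PCIe 2.0 x16 link, which is ~8 GB/s per direction on paper and usually measures lower in practice, so eight cards streaming at once ask for far more than a single host can feed:

```python
# Back-of-the-envelope PCIe saturation check (assumed figures).
pcie2_x16_theoretical = 8.0   # GB/s per direction, PCIe 2.0 x16
effective_per_card = 5.5      # assumed typical measured host<->device GB/s
num_cards = 8

aggregate_demand = num_cards * effective_per_card
print(aggregate_demand)  # 44.0 GB/s if all cards stream concurrently
```

With an aggregate demand around 44 GB/s against a single host's chipset and memory subsystem, the links cannot all run at full speed at once, which is the argument for fewer cards per box.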

I also faced a problem with this system. Does anybody know whether it works (all 8 GPUs working) with Windows 7 64-bit?