I have a workstation (MB: Asus Z10PE-D8) with 2 Titan X GPUs (Maxwell + Pascal). The MB supports 2 CPUs, but I don't want to buy another one, and the PCI-E layout doesn't allow me to leave a space between the cards.
I’m going to do a lot of neural networks training (potentially weeks of non-interrupted computation) and possibly expand to 4 GPUs.
Do you think there is / will be a cooling problem? (The space between the cards is ~1 mm.)
What's the best cooling solution you can think of given the situation?
Put the cards in PCIe slots 0 and 4, which both run at PCIe 3.0 x16 on that motherboard. That leaves plenty of space between the cards.
Choosing a case with good airflow helps too.
If you expand to 4 GPUs, then it does get trickier. I use Silverstone FT02 cases, which are probably optimal for air-cooling quad GPUs: the case rotates the motherboard for vertical airflow and uses multiple 180 mm intake fans to pressurize the case with filtered air, blowing directly on the GPUs.
The best cooling is, of course, a liquid loop with aftermarket water blocks replacing the stock GPU shrouds. Liquid cooling is not only effective, but, unlike air cooling, its effectiveness does not degrade with densely packed GPUs.
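Whichever cooling route you take, for weeks-long unattended runs it's worth logging GPU temperatures so you notice throttling before it costs you training time. A minimal sketch (assuming `nvidia-smi` is on the PATH; the 84 °C alert threshold is an arbitrary example, not an official spec):

```python
import subprocess

ALERT_C = 84  # hypothetical threshold for "check your airflow"; pick one for your cards

def parse_temps(csv_output: str) -> list[int]:
    """Parse output of `nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader`,
    which is one integer (degrees C) per line, one line per GPU."""
    return [int(line.strip()) for line in csv_output.strip().splitlines()]

def read_temps() -> list[int]:
    """Query current GPU temperatures via nvidia-smi (requires NVIDIA driver installed)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        text=True,
    )
    return parse_temps(out)

if __name__ == "__main__":
    # Demo with a sample string standing in for nvidia-smi output from two GPUs.
    sample = "83\n67\n"
    for i, t in enumerate(parse_temps(sample)):
        flag = "  <-- hot, check airflow" if t >= ALERT_C else ""
        print(f"GPU {i}: {t} C{flag}")
```

Run it from cron or a loop inside your training wrapper; if the densely packed card consistently sits near its throttle point while the other stays cool, that's your answer on whether the 1 mm gap is a problem.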