CUDA Rig: Any experience/ideas about the Thermaltake SwordM?


I have a question for people who have experience with high-power boxes for CUDA computations.

It looks like I’ll be building a new box for CUDA simulations in the near future. I would like to have the ability to plug in 4 double-slot GPUs into that box, although I’ll start with just one or two, most likely. I intend to be running a GPU-heavy CUDA app essentially around the clock on this box. It will NOT be convenient for me to have a rack-mountable box.

I just received an email discount offer to buy a new Thermaltake SwordM box for $170, shipped free, good through Memorial Day weekend, directly from the vendor. This looks like a good price for this box: it’s currently $300 plus $30 shipping at Newegg. These boxes don’t seem to depreciate very quickly, so even if my project gets delayed by a couple of months, the money I paid for the box won’t go down the drain, correct?

According to the vendor’s specs, the box supports 7 expansion slots and quite a few cooling options.

Hence my questions:

  1. Do you think I would be able to squeeze 4 double-slot CUDA cards, such as Fermi cards, into this box (note: it supports only 7 expansion slots)?

  2. Do you have any concerns regarding the cooling options offered by this box?

  3. There are 6 drive bays in the box. Do you think I would be able to squeeze in a second power supply, if so required to power my cards?

Obviously, any other thoughts would be greatly appreciated.


  1. No, this case won’t support 4 double-wide cards. For that you need either an 8- or 10-slot case, depending on the motherboard. (Most of the latest batch of motherboards are larger and need 10 slots simply because their PCIe slots are spaced further away from the CPU.)

  2. Not having seen the case except in the pics, it’s hard to say. But judging by all of those fans, it would likely be fine!

  3. There are 5.25"-drive-bay-sized PSUs that can work well for powering GPUs. They’re loud (for their size), but I’ve used them successfully in the past. Still, I wouldn’t recommend them. Just get a single 1500 watt PSU and you’ll likely be OK. Even if you’re using 4 GTX 480s, the power draw of the GPUs alone will likely be under 1000 watts for CUDA apps (graphics apps would draw more). It’s possible 1500 watts wouldn’t be enough if you’re using dual CPUs, though.

If I were you, I’d probably buy the case for a 3-GPU build… that’s a good price for it.

I have limited experience with actual builds, as I’m in a similar position to you on this. I’ve been researching and reading up on CUDA machines for a few months now. My replies:

  1. No. One option is to get the single-width GTX 470 from (I think) Galaxy for the 7th slot. Otherwise, go with something like the Antec 902, which has 8 slots and is currently $120 at Newegg.

  2. Yes. I don’t know if ANY box could properly air-cool 4 GTX 480 boards crammed together side by side in 8 slots. Otherwise, though, it looks like you could put enough fans on it.

  3. Maybe. You’ll either need a single 1200 watt supply or two supplies. Even with a big one, you have to have enough PCIe cables to drive the cards, you might need splitters, etc.

There is a sweet spot at 2 or 3 CUDA cards per box. Going to 4 requires a lot of extra trouble and expense in picking parts that support everything. Further, you need to know what your bottleneck is. If you’re sending data back and forth to the card all the time, then having more than one card in the box will slow you down a lot. If memory is your limitation, that puts other constraints on the build. But if you really can send some data off to a card and let it crunch for a long time, then 4 in one box might work for you. Even with 3, though, you have to make sure your power supply will provide what each card needs - a 700W supply with 4 12V rails won’t have quite enough power on any one rail to drive a GTX 480 or GTX 295.
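The transfer-vs-compute trade-off above can be sanity-checked with back-of-the-envelope arithmetic. This is only a sketch; the PCIe 2.0 x16 bandwidth and Fermi-class throughput figures below are rough, era-appropriate assumptions, not measurements:

```python
# Rough bandwidth and throughput assumptions (order of magnitude only):
PCIE2_X16_GBPS = 6.0     # ~6 GB/s practical host<->device over PCIe 2.0 x16
GPU_GFLOPS = 1000.0      # sustained throughput of a Fermi-class card

def transfer_bound(bytes_moved, flops_done):
    """True if moving the data takes longer than crunching it."""
    t_transfer = bytes_moved / (PCIE2_X16_GBPS * 1e9)
    t_compute = flops_done / (GPU_GFLOPS * 1e9)
    return t_transfer > t_compute

# 200 MB moved but only ~2 GFLOP of work: the bus dominates, so extra
# GPUs sharing that bus won't help much.
print(transfer_bound(200e6, 2e9))    # True: transfer-bound

# Same traffic but ~1 TFLOP of work per transfer: compute dominates,
# and 3-4 cards in one box can all stay busy.
print(transfer_bound(200e6, 1e12))   # False: compute-bound
```

If your workload lands in the first regime, adding cards mostly adds contention; in the second, scaling to 3 or 4 cards is worth the build hassle.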

Other issues - do you need video, or will you run it headless? If you need video, either get a mobo with onboard video, or make sure you have a slot for a smaller video card.

A quick recommendation:

MSI NF980-G65 AM3 NVIDIA nForce 980a SLI HDMI ATX AMD Motherboard (3 PCIe 2.0 slots, either in x16/x16/x0 or x16/x8/x8, and onboard nVIDIA video.)

AMD Phenom II X2 555 Black Edition Callisto 3.2GHz 2 x 512KB L2 Cache 6MB L3 Cache Socket AM3 80W Dual-Core Desktop Processor - C3 Revision (great bang for the buck, and you can possibly unlock 1-2 more cores, or just spring for another $50 and get the 4-core version. Of course, CUDA boxes aren’t about the processor; the 3-core Athlon II would probably do fine.)

Antec gaming case (Three Hundred Illusion, Six Hundred, Nine Hundred, or Twelve Hundred) - they all have top fans, and the Twelve Hundred can take the excellent and inexpensive CP-1000 power supply. Other cases would also work, but look at gaming cases with maximum ventilation.

900-1100 watt power supply, ensuring that each 12 V rail won’t underpower a CUDA card.

Memory, at least as much as the cards have (in total).

Hard drives (I recommend a 2-3 disk RAID 0 configuration, so you can buy smaller disks and get faster speeds).
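The 12 V rail caveat in the power-supply line above comes down to simple arithmetic. A quick sketch (the 18 A per-rail limit is an illustrative figure; read the actual amperage off your PSU’s label):

```python
def rail_watts(rail_amps, volts=12.0):
    """Maximum continuous watts a single +12 V rail can deliver."""
    return rail_amps * volts

# A multi-rail 700 W unit often limits each +12 V rail to ~18 A
# (assumed figure for illustration):
print(rail_watts(18))    # 216.0 W per rail, short of a GTX 480's ~250 W peak
```

This is why a big single-rail unit (like the CP-1000 mentioned above) is simpler to reason about than a multi-rail supply of nominally similar wattage.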

A more serious recommendation:

For your first build, target a setup that will hold 1-2 CUDA cards. Everything will be easier to get, cheaper, you’ll have more choices, and you’ll learn what to do and not to do when designing a bigger box. And if you outgrow it, either sell it on eBay, or use it for benching new code and testing stuff.

Best of luck.



Thanks a lot for your ideas. It sounds like you already saved me from frustration: I’m probably not getting this box.

Martin: if you end up building a CUDA box, PLEASE post the specs of the final build with as much detail as possible.

Nvidia: it would be A HUGE HELP for us CUDA enthusiasts if you could maintain some kind of a “recommended hardware” database. Those systems do NOT have to be certified or guaranteed to work, the data can be community-supported, and any help would be much appreciated. I do not understand why you spend so much effort building us the number crunchers and supporting us on the software side, but provide no support whatsoever (or at least none I’m aware of) as far as system engineering is concerned. A “go buy a Tesla system” response won’t satisfy many of us.


What about the Tyan S7025? It is Tesla-certified.

Is this in response to my post? In that case, I was obviously not clear enough in stating that my rig won’t be based on Tesla cards.

But I suppose you can put your geforce cards there too

I am not sure if a geforce card is bigger than a tesla card though.

Thanks, this is what I meant, but was not aware of.

They might want to add Thermaltake Xaser VI (10 expansion slots) to the cases section.

Any pointers to a similar page based on non-Tesla cards?

Regular GPUs are pretty much the same as Teslas in terms of build choices like motherboard, case, memory, etc. You do need to watch out for the PSU, since a GTX 480 can use more watts than an S2050, for example, but that’s all. Budget 250-300 watts per GTX 480 and you’ll be fine. In practice they’ll draw about 200-230 watts in CUDA apps; they only reach the peak 300 watts with graphics. Teslas are cooler and lower-wattage, which helps when building dense clusters.
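That budgeting rule can be turned into a quick PSU-sizing sketch. The base-system draw and headroom factor below are assumptions for illustration, not measured values:

```python
def recommended_psu_watts(num_gpus, watts_per_gpu, base_system=250.0,
                          headroom=0.25):
    """GPU budget plus rest-of-system draw, padded with safety headroom."""
    return (num_gpus * watts_per_gpu + base_system) * (1 + headroom)

# CUDA workloads: a GTX 480 draws roughly 230 W in practice.
print(recommended_psu_watts(4, 230))   # 1462.5 -> a 1500 W unit just covers it

# Peak graphics load (~300 W per card) is a different story:
print(recommended_psu_watts(4, 300))   # 1812.5 -> 1500 W would be tight
```

The arithmetic matches the earlier advice in this thread: a 1500 W supply is plausible for a 4-card CUDA box, but leaves little margin if the cards ever hit their graphics-load peak.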

Martin’s post earlier in this thread is excellent. A 3-GPU machine is cheap and straightforward; going to 4 GPUs really increases the build complexity a lot, and often the need for a display GPU wastes one of your 4 anyway. I’ve built two 4-GPU machines, but I like 3-GPU builds much better because of the relaxed options and significant cost savings. That MSI board is an awesome sweet spot because of the onboard graphics.

And also follow Martin’s other suggestion: Make a simpler machine first. When you’re buying 4 $500 GPUs (or 4 $2500 GPUs!) the cost of the case and motherboard and such is minor, so it’s less risky to start small and just build up if/when you need to, reusing many of the same parts.


First off, this forum is one of the places where we have a community supported site that has suggestions for hardware.

Secondly, I just noticed that NewEgg allows one to make wish lists public. Perhaps that’s an even better place for this information. It’s trivial to create a list, and all the item information is readily available. Could the experienced hands here throw together some wish lists and make them public? I’ve taken what I’m planning for my first build and made it available to all (although I can’t find it when I search; either there’s a delay, or it excludes one’s own lists from the public roster it shows me). The list is numbered 12889485 and called CUDA: One card Lunchpail CUDA.

I tried to put a long comment on it explaining my choices, but the comment is limited to 1000 characters.

If this works out, perhaps we could convince the moderators to let us put up a sticky post with links to these builds, and some explanations of the design choices.




I think I found the list you mentioned:…Number=12889485

I couldn’t see your comments there, though.

Your effort is very helpful. I will certainly use the list and try to update it when I build my rig.

Obviously, any coherent source of information is better than a bunch of scattered forum postings, which is all we have now (besides your list and the Tesla-oriented page referenced in this thread). I do think, however, that it would be very nice of nVidia to support a wiki site. This kind of task is exactly what wikis were designed for, and it has worked out pretty well for me at work in the past.

Yes, my notes aren’t visible until I log in, and I don’t know how to make them public. I tried putting comments on it, but it wouldn’t let me rate my own list. So here they are:

The goal was to make a small and inexpensive system that would hold one CUDA card of any design. I don’t have a lot of space in my home office, and I want something to learn CUDA on. I haven’t built this machine yet, but will soon, and will refine the list and my comments afterwards.

Mobo: this was the only micro-ATX mobo at NewEgg that had a full PCIe 2.0 x16 slot and onboard nVidia video. One might be able to find one with 2 PCIe slots, but I’d be worried about sticking 2 CUDA cards in a box this small unless they were older designs like a GT 220 or GT 240. And using ATI/AMD video has been linked to issues involving cursing and handwringing (per SPWorley), so I’m staying away from those mobos (however numerous they are).

CPU: this Phenom II X2 is a great value. Many people have had success unlocking the other two cores and getting a 3.2 GHz 4-core Phenom. If you want a guarantee of this, spend an extra $50 and get the X4. If you feel the processor isn’t critical, don’t want to mess with unlocking, and prefer guaranteed stability, look at the Athlon II X3 or X4 processors. I like the Phenom better for the larger caches, which I think (but don’t know) will improve data transfer performance.

Power Supply: the Antec I chose has a beefy single +12V rail. It’s overpowered, but with the PSU drawing air in from the hot case, I don’t want to put too much strain on it. And it could potentially handle the as-yet-nonexistent two-chip GF100 card (which may never exist due to power and heat limitations). It currently has a mail-in rebate as well, so the price is comparable to a mid-level 500W unit.

Case: It’s small, inexpensive, has a handle, and has a vented side panel. Installing a 120 mm intake fan in the front (as some of the reviewer comments state) is an easy win, although the install isn’t trivial. I suspect this case will be adequate for a single-card system and won’t take up the room that a mid-size or full ATX case will. Also, my rough measurements of the interior indicate that it could take a PCIe card as long as 11.25" (I measured 11.31" from the PHOTO, not the actual case). So the nVidia 9.5" to 10" cards should probably fit fine.

Other components: not critical. These look like a decent combination of price and good ratings when I put the list together.

Anything missing? Any obvious problems?