MacBook Pro and external card? I need a Compute Capability 1.3 card on my MacBook

Hi all,

I posted a similar question to the main CUDA forum before I saw there was a Mac-dedicated forum, so I thought I’d retry here.

I need to use a CC 1.3 capable card to get double-precision processing. I’d like to do this on a MacBook Pro, so I’m looking for some way to run an external card. I figure ExpressCard is the way to go (I’m not worried about bus speed), but so far the only thing that’s come up from my search and the other post is a $2400 expansion chassis from Magma… I might as well buy a new Mac Pro!
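(For context, here’s a minimal sketch of how I check for a 1.3-capable device at runtime with the standard CUDA runtime device-query calls; nothing Mac-specific about it, and the looping and error handling are just illustrative.)

```cuda
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA devices found.\n");
        return 1;
    }

    for (int dev = 0; dev < count; ++dev) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);

        // Double precision requires compute capability 1.3 or higher.
        bool hasDouble = (prop.major > 1) ||
                         (prop.major == 1 && prop.minor >= 3);

        std::printf("Device %d: %s (CC %d.%d), double precision: %s\n",
                    dev, prop.name, prop.major, prop.minor,
                    hasDouble ? "yes" : "no");
    }
    return 0;
}
```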

Does anyone have any other ideas? Much appreciated, I’d rather keep my MacBook than have to trade it in for a Mac Pro.

Cheers,
Michael

Yeah… Sign up for the ADC, then buy the Mac Pro at $1999 instead of $2499. Using ExpressCard with a GTX 285 sounds like a world o’ hurt. By the time you get the secondary power supply jury-rigged and IF everything works over a PCIe x1 connection, you’ll still have one big hack that nobody will support.

Life is too short…

Oh for a mobile 1.3 GPU. At the moment I put my Mac Pro on a tea trolley to wheel it to lectures. Let me know if you get anywhere.

The cheapest option I’ve found so far is a Magma EB1 board set (i.e. just the board, with no PSU or enclosure) for $600 from Magma themselves. The retail EB1 is a one-slot PCIe expansion unit that connects over ExpressCard, but it can’t fit a double-width card (e.g. the Nvidia GTX 285) and only has a 60 W PSU. So after adding a PSU and an enclosure, I figure it will come in under $1000, though there’s no guarantee it will work. Magma says the EB1 has been tested with MacBook Pros, but for things like audio cards, not GPU processing. I’ll let you know if I actually try it!

M

It’s probably going to be a while until there is a MacBook with compute capability 1.3. The fastest mobile GPU in gaming laptops right now is the 260M, which is a rebranded 8800, so I figure it will be about two years until the GT200 architecture makes it to mobile devices, and then another two years until those parts come down in price enough for Apple to use them in its mainstream laptops.

Have you considered running your code remotely via ssh? Even if you don’t currently have a Linux box to run your code on, I’m sure setting one up would be a lot cheaper and more reliable than jerry-rigging the Magma EB1.

The 40 nm mobile GPUs due to be released later this year are going to be compute capability 1.2 parts. They will have all the features of the 1.3 devices except for double precision. So we’ll be part way there in 6 months, hopefully.
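If it helps, the usual trick for code that has to run on both is to guard the double-precision path on __CUDA_ARCH__ so the same kernel still builds for 1.1/1.2 parts. Here’s a rough sketch; the kernel itself is made up for illustration, and it assumes a single block of at most 256 threads (a power of two).

```cuda
#include <cuda_runtime.h>

// Hypothetical kernel, just to illustrate the guard: accumulate a dot
// product in double on CC >= 1.3 devices and fall back to float elsewhere.
__global__ void dotProduct(const float *a, const float *b, float *result, int n)
{
#if defined(__CUDA_ARCH__) && __CUDA_ARCH__ >= 130
    typedef double acc_t;   // real double precision on 1.3+ parts
#else
    typedef float acc_t;    // single-precision fallback on 1.1/1.2 parts
#endif

    acc_t sum = 0;
    for (int i = threadIdx.x; i < n; i += blockDim.x)
        sum += (acc_t)a[i] * (acc_t)b[i];

    // Block-wide reduction in shared memory (assumes blockDim.x <= 256,
    // a power of two, and a single-block launch).
    __shared__ acc_t partial[256];
    partial[threadIdx.x] = sum;
    __syncthreads();
    for (int s = blockDim.x / 2; s > 0; s >>= 1) {
        if (threadIdx.x < s)
            partial[threadIdx.x] += partial[threadIdx.x + s];
        __syncthreads();
    }
    if (threadIdx.x == 0)
        *result = (float)partial[0];
}
```

You only get the double path when you compile with -arch=sm_13 (or higher); building for the default sm_10 target just demotes doubles to floats with a warning anyway.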