External nVidia GPUs for laptops through Express Card slot? Seeking vendors

I’ve seen several ads for laptop-based external graphics solutions that connect via the ExpressCard slot on the laptop and provide a cable-based PCI Express link to a box containing a GPU and some video RAM.

As an example, an ATI based product sold by Fujitsu Siemens is found here:

http://www.fujitsu-siemens.com/home/produc…ic_booster.html
http://sp.fujitsu-siemens.com/dmsp/docs/ds…phicbooster.pdf

Their product page doesn’t explicitly state whether it connects through ExpressCard or USB 2.0, but ExpressCard seems more realistic. According to their spec sheet they achieve PCI Express x8 connectivity. Wow, I did not know that was possible over a cable. Maybe they added a proprietary interface to their laptop, since compatibility is only stated for one particular laptop model.

Now to my actual question:

Would anyone know whether any external nVidia-based solutions are available? My employer is considering offloading some engineering processing to the GPU, but we need information about vendors who provide nVidia GPUs in an external box. Bandwidth is of course a concern, so we need to run some tests to figure out whether the limited bandwidth of the ExpressCard slot (or USB 2.0) would be sufficient for our purposes.
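Before running real tests, a quick back-of-envelope script can at least tell whether an interface is in the right ballpark. The rates below are theoretical peaks (real throughput is typically lower), and the 64 MB working set is just an assumed example:

```python
# Back-of-envelope: how long does it take to move a working set
# across each candidate link?  Peak theoretical rates; real-world
# throughput is usually well below these.
LINKS_MB_S = {
    "USB 2.0":          60,    # 480 Mbit/s signaling
    "ExpressCard (x1)": 250,   # one PCIe 1.x lane after 8b/10b encoding
    "PCIe 1.x x16":     4000,  # a desktop graphics slot, for comparison
}

def transfer_ms(megabytes, link_mb_s):
    """Time in milliseconds to move `megabytes` over a link."""
    return megabytes / link_mb_s * 1000.0

workload_mb = 64  # assumed per-step upload size for the simulation
for name, rate in LINKS_MB_S.items():
    print(f"{name:18s} {transfer_ms(workload_mb, rate):8.1f} ms")
```

If the per-step transfer time already dwarfs the expected GPU compute time, the interface is ruled out before any hardware is bought.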

Christian

no

just get a desktop. so much easier.

there aren’t any external solutions from nVidia apart from the Tesla boxes, which are designed for very process-intensive workloads. they link to desktop PCs only, however, not laptops. laptops simply aren’t made for high-end data processing.

On further research, I found one vendor using the ExpressCard interface, but they don’t state whether they ship ATI or nVidia GPUs inside.

http://www.villagetronic.com/vidock/index.html
http://www.villagetronic.com/vidock/media/ViDockGfx.pdf

Here is a review with some hardware photos (PCB shots)
http://www.tomshardware.com/reviews/vidock…ics,1933-3.html

The drawback is the limited ExpressCard bandwidth, quoted as at most 2 GB/sec. Giga… bytes? Or bits? I think it must really be 250 MBytes/sec, corresponding to a single PCI Express lane.
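The 250 MByte/s figure falls straight out of the PCIe 1.x signaling rate: each lane runs at 2.5 Gbit/s, and 8b/10b encoding spends 10 line bits per data byte. A quick check (assuming PCIe 1.x rates):

```python
# One PCIe 1.x lane: 2.5 Gbit/s raw signaling, 8b/10b encoding
# (10 line bits per data byte), so usable bandwidth per direction is:
line_rate_gbit = 2.5
bytes_per_s = line_rate_gbit * 1e9 / 10   # 10 wire bits per payload byte
print(bytes_per_s / 1e6, "MB/s")          # prints: 250.0 MB/s

# The x8 link claimed for the Fujitsu Siemens box would be 8 lanes:
print(8 * bytes_per_s / 1e6, "MB/s")      # prints: 2000.0 MB/s
```

Interestingly, the x8 link works out to exactly 2 GB/s, so the “2 GB/sec” figure may have been lifted from the x8 Fujitsu Siemens spec rather than measured on ExpressCard.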

Christian

Dell Precision M6400, cough. High-end price and high-end GPU.

so… doesn’t your laptop have a Quadro FX 3700M in it?

We shipped a lot of Fujitsu Siemens laptops with Intel integrated graphics hosting our engineering application, and the idea is to add an external GPU box that enables some more advanced simulations in real time. Ideally the laptops wouldn’t have to be replaced, but could be “upgraded” with some kind of magic box.

Christian

and you didn’t check to see if this “magic box” existed before you purchased all those laptops?

i see your reasoning, but it may have been better to check first…

does the software you are using even support CUDA-enabled graphics cards?

We’re working on some CUDA magic now. Our application has been shipping for three years in various forms, so it predates CUDA entirely.

With the looming financial crisis, some companies have imposed strict budgets on new hardware purchases, so we’re aiming for upgrades, rather than complete hardware replacements where feasible.

Don’t dismiss the need so easily. I would also pay for such a box for my laptop!

There’s a ton of uses, but the most obvious one is portability… travel to a conference to give a talk, you can demo on your laptop.

Same with traveling to a client or demoing at a studio, whatever.

What’s the alternative? Lug your own tower around and hook it all up for a 5 minute demo?

Or intrude on someone’s machine at the site, hope it has the right hardware, and spend time installing drivers and software?

And in fact I use my laptop for CUDA development all the time and I would pay serious $ to get some G200 support, including plugging in an external box like this.

Sure for your desktop machine, buy a card, but actually this kind of hardware innovation is exactly what will make GPU computing easier and more ubiquitous.

i can see that you guys would appreciate this, but the question was whether a “magic box” like this exists, and the answer is sadly no. wanting it a lot sadly doesn’t change that fact.

http://en.wikipedia.org/wiki/ExpressCard

… so it should be technically possible.

In fact, from the ViDock review, http://www.tomshardware.com/reviews/vidock…ics,1933-4.html , we see that this has actually been done.

The bandwidth isn’t too good, though. Hacking a ViDock could be an interim measure, but you’d still need to feed external 12 V power to the card.

If I were to build a luggable demo machine, I’d put a GTX 260 in a headless Shuttle XPC case and lug that plus a tiny laptop around. The laptop serves as a “smart” keyboard + monitor for the worker box.

Retrofitting laptops for dev use is really, really hard, I think. A cheaptastic worker box + some nice remote desktop software to “phone home” gives you the best of both worlds.

I already thought about this, but no Shuttle case takes the full 10.5-inch length of a GTX 260. And the double-slot width is also rare.

Shuttle is just coming out with new systems soon to solve these problems. (They showed them at CES!)

I asked at a hardware forum last month, and the suggestion there was to build my own desktop luggable based on a microATX motherboard.

http://techreport.com/forums/viewtopic.php?f=22&t=64014

Edit: Followup: the Shuttle boxes that support the G200 cards are out as of yesterday! About $3000 for the system. http://us.shuttle.com/H7_G4500SDXi.aspx

I am not a guru in SaaS, but just my 2 cents…

Consider launching your new software via SaaS: provide your clients with a SaaS interface and run a GPU cluster in the background (at your company).

Write a SaaS client for your customers that submits jobs and displays the results the way your customers want…

And your customers can pay per use, or whatever…

Hehe, the problem is that our clients take that software to trade shows, display it to their customers on site, or use it for training. Often there’s just no reliable Internet available on site. Everything needs to remain portable (as portable as a stack of three laptops can possibly be) and independent of any Ethernet jack or WiFi connection.

Christian

Aah… I see.

I still remember a problem we faced at a CUDA roadshow recently. Our multi-GPU library was NOT working as expected on the personal supercomputer (4 Teslas + a low-end NVIEW card). The NVIEW card was very, very low end (with only 1 multiprocessor), and we were load-balancing as if all GPUs were of equal power. (The strength of a chain is the strength of its weakest link — not necessarily, though, if you design to accommodate the weak link.)

After recompiling the DLLs on my laptop, I realized that USB had been disabled on my laptop (company policy). There was no Internet available to transfer them either. Finally, we used Bluetooth to transfer them to my cellphone and then used the cellphone’s USB cable to transfer them to the target computer… That was a bit of drama, but it was worth it. We got 500x to 1300x speedups for some of the financial algorithms compared to a single core of an i7.

I can understand your problem.

But with cellphone Internet booming, I think it will NOT be long before we see

  1. Parallel-processing-powered cellphones participating in the FAH project.

  2. SaaS via Internet over a cellphone.

3G might solve all these problems. Let’s see…

Best Regards,

Sarnath

Here’s one I found:

http://www.magma.com/products/pciexpress/e…box1/index.html

More info here (look at the citations):

http://en.wikipedia.org/wiki/PCI_Express#E…CIe_Video_Cards