External GPU for a laptop… This makes sense if CPU-GPU memory transfer is minimal, which is true for those who don’t use the CPU to compute at all… and it makes a lot more sense if you only have a laptop with no CUDA-capable card and are as broke as I am, so another desktop would be unaffordable…
My laptop has an ExpressCard 54 slot and 2 mini PCI-e slots. Any idea how to connect the PCI-e x16 GTX 275 to those x1 slots?
Actually I found a lane converter that converts a PCI-e x16 input to a PCI-e x1 output. It’s here: http://www.orbitmicro.com/global/pexp16-sx-16-1-p-1648.html
But I’m not sure whether that converter fits my mini PCI-e slot or not… Of course I’d need an extension cable, since that thing couldn’t possibly be crammed into my laptop. What I’m worried about is: would the GTX 275 be able to work with a single lane? And would that PCI-e x1 output be compatible with the mini PCI-e standard?
I’m also looking for an adapter that could take the PCI-e x16 GTX 275 card and connect to the ExpressCard 54 slot. That way I wouldn’t have to open the back lid to access the mini PCI-e slots. Do you guys know of any such adapters?
Any help would be greatly appreciated!! And I guess if there’s a good (cheap) solution to this, a lot more people would jump up from their seats and scream in excitement… :D
Power issues will make this unworkable. A card like the GTX 275 expects to be able to pull 75W from the host PCI-e slot, plus 75W from each of its two 6-pin external power connectors. There is no way your laptop can supply anything like 75W from a mini PCI-e slot. There are ExpressCard-based PCI-e docks available, like this, but they are expensive and limited to 55W cards, which is nowhere near enough for a full-sized GPU.
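To make the power argument concrete, here is a rough budget sketch. The numbers (75W per source) are from the post above; treating them as the card's full draw ceiling is my own simplification:

```python
# Rough power budget for a GTX 275-class card, per the figures above:
# 75 W from the PCI-e x16 slot, 75 W from each of two 6-pin connectors.
SLOT_W = 75
SIX_PIN_W = 75
NUM_SIX_PIN = 2

total_budget_w = SLOT_W + NUM_SIX_PIN * SIX_PIN_W
print(total_budget_w)  # 225 W that the card can draw in total

# A mini PCI-e slot is designed for low-power devices (Wi-Fi cards etc.),
# so it cannot come close to the 75 W slot portion a full-size card expects.
```

Even the 55W ceiling of the ExpressCard docks mentioned above covers well under half of the slot-plus-connector budget.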
Oh… I checked on that Magma thing before. I was simply turned away by the ridiculous price.
How does the card get so much electricity through the gold fingers without disturbing the signals?
Would it be possible for me to supply more power from the external connectors and thus eliminate the need for that 75W from the slot?
Hi seibert, yes you are right that buying a used PC seems like a simpler solution, but it would cost at least 150 dollars, I suppose? Actually if this thing just doesn’t work at all then I’m gonna buy a used computer and a GTX 275. But if there’s a cheaper solution I might be able to buy a GTX 295 or even a new Fermi card!!
Goldfinger had something to do with broccoli, didn’t it? IIRC, there are four 12V lines on the PCI-e slot, so 75W at 12V is about 6.25 amps, or roughly 1.5 amps per pin, which looks reasonable to my non-expert eyes.
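The per-pin arithmetic above can be checked in a couple of lines (the four-pin count is as stated in the post; I'm ignoring the 3.3V rail for simplicity):

```python
SLOT_POWER_W = 75.0   # power the card may draw from the slot
RAIL_V = 12.0         # 12 V supply rail
TWELVE_V_PINS = 4     # 12 V pins on the x16 edge connector (per the post)

total_amps = SLOT_POWER_W / RAIL_V
amps_per_pin = total_amps / TWELVE_V_PINS
print(total_amps, amps_per_pin)  # 6.25 A total, ~1.56 A per pin
```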
I very much doubt it. The power plane on these boards is rather complex, and usually different segments of the DC-DC circuits are fed by different lines. So you can’t just magically “supply” more energy to one part of the DC-DC converter array in the hope that another will draw less by some miracle of metaphysics.
Shuttle has announced an external graphics card station that provides up to 90 Watts to the GPU and connects using a proprietary PCI-e x1 connection. As avidday has said already, there is no possible external solution for the power needs of a GT200 GPU. And, as seibert has said already, recycling an old PC makes far more sense if you are short on cash. If you want a third opinion, here is Sanjin Radoš’s comment at the end of the above article:
“It’s well worth noting Asus has tried this concept before but it decided to drop it as it didn’t result in great performance and PCIe 1X was simply not enough, at least for Asus.”
Other than wanting to spend $300 or so on a GTX 275 GPU, but not on a PC to put it in, you haven’t exactly stated a problem that Nvidia needs to solve. If you are asking how to write and debug CUDA programs without having a CUDA-compatible GPU, Nvidia describes how to do this in section 3.2.10 of the CUDA Programming Guide here.
+1 for something like that. Recycling an old PC is not always the best solution - it uses too much space, makes too much noise, creates additional administration hassle, etc. So an external case for a PCI-e card, with as fast a connection to the laptop as possible, would be the perfect solution.
On the other hand, a possible alternative would be to use one of those HPC-on-demand offerings that have started to appear recently, like POD from Penguin Computing, or Cyclone from SGI.
So you are looking for something like a mini-S1070 (or an S1070 itself - the host adapter should fit in most laptops). Once you are no longer trying to save money, all things are possible. The main thing to keep in mind is that the device you are proposing needs to be good at playing video games, or else it will be DIY or expensive (or both). HPC is just not a mass market. And HPC developers, even less so.
Where would the adapter fit in? I think there is a huge market for this thing. Most people I know have moved to laptops, but still want to connect to a big screen when in the office, or even more so to multiple screens. CPUs in 13-inch laptops are plenty strong these days, even for heavy development, but the graphics… the best you can hope for is a 9400M in such a small laptop. I know a few people who would love to buy such an external GPU setup if it existed for a normal price (say a 50-100 dollar overhead over the price of the GPU). The thing is, it’s a chicken-and-egg problem: for it to be successful, a lot of laptops need to support this new external PCI-e x16 interface. But they won’t do it until there are products and people buying them…
To put this “50-100 dollar” overhead into some perspective, Nvidia makes a deskside box called the QuadroPlex 2200 D2, which is a self-powered chassis containing a pair of Quadro FX 5800s, and it comes with a PCI-e x16 host adapter card and cable to hook it up to the host PC. It isn’t that far from what is being suggested here - externally powered and hosted card(s) with a host adapter card. It runs at about $10,500, while an FX 5800 on its own will set you back about $3,500. So in this case, Nvidia isn’t charging 50-100 dollars to cover the engineering effort of building, testing, certifying and supporting an external enclosure for a compute 1.3 GPU - they are charging about $3,500 for it.
What is being asked for is a boutique product which would probably sell in minuscule volumes compared to mainstream AIB kits. It won’t be $3,000 over the top of the cost of the card, but it won’t be $50-100 either. And at the end of it, all you will have is a potentially fast GPU hanging off an ExpressCard 54 link, which would be (at least in my opinion) somewhere between an engineering curiosity and useless.
Or to put it another way: this idea is not crazy (as long as both laptops and PC gaming continue to be popular) but step 1 is for laptops to get a low latency, high bandwidth external link that can handle at least 2-4 GB/sec. Bonus points if it turns out to be PCI-Express itself.
Once you have that, then I could see something like NVIDIA’s Optimus technology and a $400 box (+ card) accelerating PC games on a laptop. CUDA would come along for the ride.
But without the external link speed, I’d say this is a non-starter for any commercial product.
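A rough sketch of why the link speed is the sticking point. The per-lane figures below are the usual approximate usable PCI-e rates (Gen1 ≈ 250 MB/s per lane, Gen2 ≈ 500 MB/s per lane, one direction); treating an ExpressCard-era link as a single Gen1 lane is my assumption:

```python
# Approximate usable one-way PCI-e bandwidth per lane, in GB/s.
PER_LANE_GB_S = {"gen1": 0.25, "gen2": 0.5}

def link_bandwidth(gen, lanes):
    """Rough one-way bandwidth for a PCI-e link of the given generation/width."""
    return PER_LANE_GB_S[gen] * lanes

print(link_bandwidth("gen1", 1))   # ExpressCard-style x1 link: 0.25 GB/s
print(link_bandwidth("gen2", 16))  # desktop x16 Gen2 slot: 8.0 GB/s
```

Against the 2-4 GB/sec target mentioned above, a Gen1 x1 link falls short by roughly an order of magnitude, which is the gap any external-GPU product would need to close.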
The QuadroPlex solution is a… wait for it… Quadro product. These products are priced in a completely different way. By that logic, would you also expect a gaming GPU based on the same silicon as the Quadro FX 5800 to cost over $3,000? Well, the GTX 285 costs a bit less…
We are talking about a consumer product, which I believe has huge market potential, for the reasons I explained earlier. I think the big challenge is getting laptop makers to create laptops with an external x16 PCI-e interface. The actual external box is fairly simple to put together, and there are already parts out there you can buy to build one yourself for less than $100.
erdooom’s link is cool! Seems the people there got it working! But I’d need some data-mining tool to get a better understanding of that sea of replies.
Anyway… I shall wait until Fermi comes out and then check back on that link… seems there’s still progress going on there.