GTX260 X2 is coming in January

The die-shrunk (i.e., cooler) G200 and G200 X2 (aka GTX 295) are coming. In January.

http://en.expreview.com/2008/12/03/dual-gp…rce-gtx295.html

I’m just wondering how this card will be recognized by CUDA. Any ideas?

Do I just see the doubled MPs and the doubled memory bandwidth?

Or is it the other way around, i.e. two separate devices?

Would be a great card for my simulations.

Any comments highly appreciated!

It’s just like the 9800 GX2. You see it as two separate cards, and you need to write your software to be multi-GPU.
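For concreteness, here is a minimal sketch of what that looks like with the current (CUDA 2.x) runtime API, assuming one pthread per device; the kernel, the chunk size, and the work split are placeholders of mine, not anything specific to the GTX 295. Each half of a GX2-style board shows up as its own device with its own memory, and each host thread drives one of them.

```cpp
// Minimal multi-GPU sketch: one host thread per CUDA device (CUDA 2.x style).
// Kernel and work split are illustrative only. Build with: nvcc multi_gpu.cu -lpthread
#include <cstdio>
#include <pthread.h>
#include <cuda_runtime.h>

__global__ void scale(float *data, int n, float factor)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

struct WorkerArgs {
    int device;   // which CUDA device this host thread drives
    int n;        // number of elements handled by this device
};

static void *worker(void *p)
{
    WorkerArgs *args = (WorkerArgs *)p;

    // Each host thread gets its own context by selecting its own device
    // before making any other CUDA call.
    cudaSetDevice(args->device);

    float *d_data = 0;
    cudaMalloc((void **)&d_data, args->n * sizeof(float));
    cudaMemset(d_data, 0, args->n * sizeof(float));

    scale<<<(args->n + 255) / 256, 256>>>(d_data, args->n, 2.0f);
    cudaThreadSynchronize();   // CUDA 2.x-era synchronization call

    cudaFree(d_data);
    return 0;
}

int main()
{
    int deviceCount = 0;
    cudaGetDeviceCount(&deviceCount);
    printf("CUDA sees %d device(s)\n", deviceCount);

    pthread_t threads[16];
    WorkerArgs args[16];
    for (int d = 0; d < deviceCount && d < 16; ++d) {
        args[d].device = d;
        args[d].n = 1 << 20;            // give each GPU its own chunk of work
        pthread_create(&threads[d], 0, worker, &args[d]);
    }
    for (int d = 0; d < deviceCount && d < 16; ++d)
        pthread_join(threads[d], 0);
    return 0;
}
```

On a GTX 295 this should report two devices, each presumably with its own half of the advertised 1792 MB; nothing is pooled automatically.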

According to this article, the GTX 295 has 480 total cores, meaning it’s basically two GTX 280s shoved in the same form factor.

Any idea which source is more reliable? Any official press release from Nvidia?

So the cooler card should essentially be a big step toward a mobile solution based on the GT200, right?

bit-tech has their hands on a GTX 295 engineering sample: http://www.bit-tech.net/hardware/2008/12/1…tx-295-1792mb/1

Two PCBs sandwiched together with a fan in between means that these puppies will get HOT.

In future releases of CUDA (2.1 or later), will there be transparent support for SLI or these dual-card configurations? It would be nice to write the program for a single card and then just let SLI do the management.

No. Constantly accessing the device memory of the other card would kill performance. (It’s not like SLI is all that efficient either.)

But NVIDIA definitely has to better support multiple GPUs at the CUDA level. At the very least, let one thread access two cards (and permit device-to-device copies). But maybe something more interesting could be thought up.
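Until something like that exists, one workaround for touching two cards from a single host thread is the driver API’s context push/pop, with the “device-to-device” copy staged through host memory. A rough sketch (device ordinals, buffer size, and the missing error checking are my own simplifications):

```cpp
// Sketch: one host thread juggling two driver-API contexts, copying a buffer
// from GPU 0 to GPU 1 by staging through host memory. Link with -lcuda.
#include <cstdlib>
#include <cuda.h>

int main()
{
    const size_t bytes = 1 << 20;
    void *h_staging = malloc(bytes);      // host staging buffer (pinned would be faster)

    cuInit(0);

    CUdevice dev0, dev1;
    cuDeviceGet(&dev0, 0);
    cuDeviceGet(&dev1, 1);                // the second half of a GTX 295, presumably

    // Create one context per device; cuCtxCreate leaves the new context current,
    // so pop it off this thread until we actually need it.
    CUcontext ctx0, ctx1;
    CUdeviceptr d_src, d_dst;

    cuCtxCreate(&ctx0, 0, dev0);
    cuMemAlloc(&d_src, bytes);
    cuCtxPopCurrent(NULL);

    cuCtxCreate(&ctx1, 0, dev1);
    cuMemAlloc(&d_dst, bytes);
    cuCtxPopCurrent(NULL);

    // "Device-to-device" copy, staged through the host: GPU0 -> host -> GPU1.
    cuCtxPushCurrent(ctx0);
    cuMemcpyDtoH(h_staging, d_src, bytes);
    cuCtxPopCurrent(NULL);

    cuCtxPushCurrent(ctx1);
    cuMemcpyHtoD(d_dst, h_staging, bytes);
    cuCtxPopCurrent(NULL);

    // Cleanup: free each allocation with its own context current.
    cuCtxPushCurrent(ctx0); cuMemFree(d_src); cuCtxPopCurrent(NULL);
    cuCtxPushCurrent(ctx1); cuMemFree(d_dst); cuCtxPopCurrent(NULL);
    cuCtxDestroy(ctx0);
    cuCtxDestroy(ctx1);
    free(h_staging);
    return 0;
}
```

The extra hop through host memory is exactly why constantly shuffling data between the two halves of such a card would hurt performance.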

According to that second article, it’s got 432 cores, i.e. 2x GTX 260-216.

It’s always fascinating for me to see the evolution of the package and cooling solutions.

If you take a look at the pictures in the article, you see several new innovations. The fan is similar, but only the rear part of it is open. There is a hole in the bottom PCB to let more air into the fan. There are now vents at the top of the card to let more air escape (so the fins inside aren’t perfectly parallel, I’d guess).

Every time you think this cooling technology has reached its zenith, they get new ideas. Besides throwing more heatpipes inside, what could they possibly do to improve?

There is a LOT to improve on the stock cooler for any card. Watercooling it will halve the temps.

Watercooling on a swappable card? Not in this decade (or the next). And I really don’t think you can improve much on the stock cooling. Does the GTX280 even have aftermarket coolers?

P.S. How’s Windows 7 treating you? Is it usable for day-to-day? (And is it possible now to turn off all the aqua highlights? I always have to turn Aero off because of them :( )

@qazax,

On a lighter note: I guess your code will have a 1:100 code:comment ratio… :P

Alex,

I just bought 3 watercooled GTX 280s for my new system (btw, it works OK, but I need to cool my northbridge better; it has a crappy fan on the nForce mobo).

I considered returning one unopened card, waiting for the GTX 295s, and then switching to them for any future expansion. I did not… I don’t believe watercooled 295s will be available soon enough for me. If I wanted a rack-mounted and presumably very noisy solution, I could have a Tesla already. But for a desktop, watercooling is… so cool :-)

pawel

To make a mobile version of one of these devices you need a TDP of 60W or less, I guess, so a 350MHz 55nm GT200 would probably be the sweet spot for a mobile GPU… I could be mistaken… a 350MHz GT200 would draw 100W more or less… can you dissipate that much energy in a small heatsink? Not easy… and battery life would not like it either…

They could underclock the GT200 to get the required TDP and still have double precision available for CUDA applications on the laptop.
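As a back-of-envelope sanity check on those numbers (my own estimate, assuming desktop GT200 boards sit somewhere in the 180 to 240 W TDP range and that dynamic power follows the usual relation):

$$
P_{\text{dyn}} \approx \alpha\,C\,V^{2}f
\quad\Rightarrow\quad
\frac{P_{350\,\text{MHz}}}{P_{600\,\text{MHz}}} \approx \frac{350}{600} \approx 0.58 \ \text{at fixed voltage.}
$$

So a clock cut alone lands such a part somewhere around 105 to 140 W, consistent with the “100W more or less” guess; getting down to 60 W would also need a lower core voltage (which the slower clock and the 55 nm process should allow) and possibly fewer enabled SMs.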

Last time I looked at water cooling, what struck me was the really low MTBF on the pumps. Something like 20k hours. Of course once your pump dies, so does everything else. What’s the solution to that?

20k hours = just over 2 YEARS. I have a water-cooled system whose pump has been running for over a year continuously without any problem - with the pump, that is. I did have a problem when a fitting got cracked (apparently someone was trying to vacuum behind the desk…), causing the tank to spring a slow leak, however >.> Thankfully, the tank is outside the box, so no real damage occurred, unless you count the carpet. Since the tank was out of the way, I didn’t become aware of the leak until after the tank was nearly dry and the motherboard’s built-in overheating siren went off. Even then, since it’s an always-on system, it might well have been hours before I walked through the room and heard it. Nevertheless, the system survived with no problems.

The point is that components are tough enough to run, at least for a while, with compromised cooling systems. Long enough for you to notice something’s wrong, and power down to replace the pump. The thing you should worry about with water cooling is leaks.

Don’t be so negative and depressing, of course you can! Have you seen quick-connect connectors for watercooling systems? You can unplug them and it doesn’t leak; Google it if you are unsure. All you need is a few of them going to the graphics card and there you go. You can have SLI systems with splitters for the tubes so that you can disconnect one graphics card and the other will still get water to it. Most things are possible if you think about them enough.

Yes, there are aftermarket coolers available for the GTX 280. I’m looking at waterblocks as heat is limiting my OC - the VRM on my card gets really hot.

Windows 7 is fantastic, I can’t wait till it’s released properly. Hmmm, not sure on the aqua highlights; I know what you mean, but I just got used to them on Vista so I haven’t noticed them in Windows 7. I’m sure if you are that bothered you could tweak something :thumbup:

You’re right, I guess I was a little too quick to dismiss watercooling. The quick-disconnect fittings are very interesting. The 1 ml they release is very manageable.

The two-year life expectancy of the pump, though, is a huge problem. You have to understand I’m not looking at it as an experiment I’d do on my rig, but as something more mainstream, like using it in a cluster. Two years is way too short. If you deploy it on ten machines, one will break within months. But most datacenters have a supply of cool water. Does anyone know what pressure it is at, and if it can be used without an additional pump?
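To put a rough number on “one will break within months” (assuming independent pumps with exponentially distributed lifetimes, which is a simplification of real wear-out behaviour):

$$
E[T_{\text{first failure}}] = \frac{\text{MTBF}}{n} = \frac{20000\ \text{h}}{10} = 2000\ \text{h} \approx 83\ \text{days of continuous operation.}
$$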

What intrigues me is that Apple used watercooling commercially in the G5. Does anyone know how that worked?

It’s hard to find accurate failure rates for the G5 liquid cooling system on the web. Certainly, whenever it did break the failure was spectacular. (So Internet reports are going to exaggerate the negative here.) Usually the entire motherboard had to be replaced.

This page has a reader survey:

http://www.macintouch.com/reliability/pmg5.html

but it is really hard to interpret the numbers in terms of a failure rate per year. The liquid cooling failure rate does not appear to be a disproportionately large source of repairs compared to other things. It does add yet another source of problems about on par with optical drive failure rates, but with a much more expensive repair. It is interesting to note that Apple’s 2nd generation cooling system on the quad G5 did have fewer problems (but that might also be due to the release of these units being closer to the survey date).