Is the G92 Double-Precision Capable?

Is the upcoming G92 chip going to support double-precision floating point in CUDA or will we need to wait for the generation after that?

According to information on this forum, the Tesla based on G92 will be double-precision capable. Other cards will not.

It appears ATI now has cards out that support double-precision: http://ati.amd.com/products/streamprocessor/specs.html

NVidia will probably release one any minute now :)

Does anyone have any news/rumours/wild-speculation about this?

Some codenames were released for the new DX10.1 cards recently, but there was no mention of release dates, which I’d imagine puts them well into 2008.

If there’s no double precision NVidia card this year I’ll probably be forced to defect to the other team…

The transistor count grew a lot in the G92 series. That might be because they started to support double precision (I can’t think of any other reason), and it makes sense that we don’t know yet, since they said they will only enable it for the high-end Quadro G92-based cards. We’ll have to wait until those are out. I don’t think it’ll be long.

G92 does not support double precision

Yeah, I would be very surprised if it did. If it supported double precision, they’d have been shooting themselves in the foot by not marketing it that way…

So the real question is when we can expect to see a DX10.1 part out of NVidia.

G92 has more transistors because 1) the off-board elements of the G80 were brought back onboard and 2) it features the new PureVideo HD 2 engine.

Is the GTX high-end, and does it have double precision?

looks like it is in stock!

http://hothardware.pricegrabber.com/search…=geforce%208800

Lars

The 8800 GTX has a G80 chip and is single-precision only, with CUDA compute capability 1.0.

None of the NVIDIA cards have double precision yet, certainly not the 8800 GTX.
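
For reference, here is a minimal sketch (not from this thread, just an illustration using the standard CUDA runtime API) that prints each device's compute capability. Double-precision hardware only appears on devices reporting compute capability 1.3 or higher, so G80 (1.0) and G92 (1.1) parts will show no double support.

#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // Double precision requires compute capability >= 1.3;
        // G80 reports 1.0 and G92 reports 1.1, both single precision only.
        bool hasDouble = (prop.major > 1) || (prop.major == 1 && prop.minor >= 3);
        printf("Device %d: %s (compute %d.%d), double precision: %s\n",
               i, prop.name, prop.major, prop.minor, hasDouble ? "yes" : "no");
    }
    return 0;
}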