Hello guys,

The G80 GTX is rated at around 518 GFLOPS (http://en.wikipedia.org/wiki/G80#GeForce_8800), which I know is a theoretical value since the arithmetic units are shared between the 8 SPs in an MP. But still, if I use the following calculation, 128 SPs * 2 (flops per clock, one MAD) * 1.35 GHz == 345.6 GFLOPS.
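Just to double-check the arithmetic (using only the numbers quoted above; the 518.4 figure is from the Wikipedia page), a quick sketch:

```python
# Sanity check of the GFLOPS arithmetic from the question above.
SPS = 128          # streaming processors on the G80 (GeForce 8800 GTX)
CLOCK_GHZ = 1.35   # shader clock

# Counting one MAD (multiply-add) = 2 flops per SP per clock:
mad_only = SPS * 2 * CLOCK_GHZ            # -> 345.6 GFLOPS

# Flops per SP per clock implied by the ~518 GFLOPS theoretical figure:
implied_ops = 518.4 / (SPS * CLOCK_GHZ)   # -> 3.0

print(mad_only, implied_ops)
```

So the theoretical number seems to assume 3 flops per SP per clock rather than the 2 I used, which is exactly where the missing ~173 GFLOPS comes from arithmetically.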

What happens to the missing (518 - 345) = 173 GFLOPS? Again, I know that 518 is a theoretical value, but 173 GFLOPS is really a lot. Is this the PTX virtual machine overhead?

Thanks,

jj