The GTX 295 has a theoretical 289 W maximum power usage, but CUDA doesn't use all the transistors, right? What's the maximum power usage I can expect running CUDA programs? (Image analysis in my case: megapixel images at 1000 fps…)
Does this apply to the other GTX cards?
It likely depends on your app… you may see different power draw depending on how heavily you load both compute and memory transfer at the same time.
It may also depend on the cards themselves, since their different balances of memory speed and bandwidth versus GPU compute power can affect it.
I will give you one data point, at least (with the previous caveats). I've directly measured the power use of my apps running full-bore on a GTX 295, using a Kill A Watt meter (very handy). The usage was exactly 205 watts with both GPUs fully loaded, measured as the difference between the machine's wall draw with the card installed and with the card removed entirely. Wikipedia says the TDP for the GTX 295 is 289 watts. The PSU is likely about 85% efficient, so the card itself is probably drawing around 0.85 × 205 ≈ 174 watts.
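(An aside: on newer cards and drivers you can query power draw programmatically through NVML, though whether a given GeForce reports it is another matter, and I'd expect cards of the GTX 295's era not to. A minimal sketch, assuming the NVML headers and libnvidia-ml are installed:)

    #include <stdio.h>
    #include <nvml.h>

    int main(void) {
        nvmlReturn_t rc = nvmlInit();
        if (rc != NVML_SUCCESS) {
            fprintf(stderr, "nvmlInit failed: %s\n", nvmlErrorString(rc));
            return 1;
        }

        unsigned int count = 0;
        nvmlDeviceGetCount(&count);

        for (unsigned int i = 0; i < count; ++i) {
            nvmlDevice_t dev;
            if (nvmlDeviceGetHandleByIndex(i, &dev) != NVML_SUCCESS)
                continue;

            unsigned int mw = 0;  /* reported draw, in milliwatts */
            rc = nvmlDeviceGetPowerUsage(dev, &mw);
            if (rc == NVML_SUCCESS)
                printf("GPU %u: %.1f W\n", i, mw / 1000.0);
            else
                printf("GPU %u: power reading not supported (%s)\n",
                       i, nvmlErrorString(rc));
        }

        nvmlShutdown();
        return 0;
    }

Build with something like gcc power.c -lnvidia-ml. Note this reports the board's own sensor, downstream of the PSU, so it isn't directly comparable to a wall-meter number.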
What would a GTX 295 really use in a heavy-duty game? This article shows the GTX 295 drawing 220 watts, but that's the delta between idle and full game load, not between no card at all and full load, which is what I measured. Guessing from that same chart that the (dual) GTX 295 idles around 50 watts, the game's draw on my no-card baseline is roughly 220 + 50 = 270 watts, so my app uses only about 75% of the wattage a game does.
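To make that baseline adjustment explicit, here is the arithmetic spelled out in a tiny C snippet (the 50-watt idle figure is my guess from the chart, not a measurement):

    #include <stdio.h>

    int main(void) {
        const double game_delta_w = 220.0; /* article: full game load minus idle */
        const double idle_guess_w = 50.0;  /* guessed idle draw of the dual GTX 295 */
        const double cuda_app_w   = 205.0; /* my measurement: full load minus no card */

        /* Put the article's delta on the same no-card baseline as mine. */
        double game_total_w = game_delta_w + idle_guess_w;        /* ~270 W */
        printf("game, no-card baseline: %.0f W\n", game_total_w);
        printf("CUDA app vs. game:      %.0f%%\n",
               100.0 * cuda_app_w / game_total_w);
        return 0;
    }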
The previous analysis is anecdotal, based on a single data point, and not really scientific. But as a quick approximation, 205 watts is a better guess than none.