PowerMizer power levels stuck at max on GTX 650 with 319.xx, 325.xx drivers

Hmm. Interesting. But seems to be false? I just unplugged one of the DVI monitors and it seems to be functioning correctly??

adamm@mira:~$ xrandr --screen 0
Screen 0: minimum 8 x 8, current 1920 x 1080, maximum 16384 x 16384
VGA-0 disconnected primary (normal left inverted right x axis y axis)
HDMI-0 connected 1920x1080+0+0 (normal left inverted right x axis y axis) 708mm x 398mm
   1920x1080      60.0*+   60.0     59.9     30.0     24.0     60.1     60.0  
   1600x1200      60.0  
   1360x768       60.0  
   1280x1024      75.0     60.0  
   1280x720       60.0     59.9  
   1024x768       75.0     70.1     60.0  
   800x600        75.0     72.2     60.3     56.2  
   720x480        59.9  
   640x480        75.0     72.8     60.0     59.9  
DVI-D-1 disconnected (normal left inverted right x axis y axis)
adamm@mira:~$ xrandr --screen 1
Screen 1: minimum 8 x 8, current 2048 x 1280, maximum 16384 x 16384
DVI-D-0 connected primary 1024x1280+6+0 left (normal left inverted right x axis y axis) 376mm x 301mm panning 2048x1280+0+0
   1280x1024      60.0*+   75.0  
   1152x864       75.0  
   1024x768       75.0     70.1     60.0  
   800x600        75.0     72.2     60.3  
   640x480        75.0     72.8     59.9

If I swap the DVI connection over to the previously unplugged monitor, it remains at low power too… What is happening??

In summary, here is what seems to be happening (tested):

  • HDMI + 1x DVI - low power (either DVI monitor works)*
  • 2x DVI - low power
  • HDMI + 2x DVI - max power
  • Note: 1x DVI means either of the two DVI monitors in either of the card’s two DVI ports (all 4 combinations)

I’ve also tested 3 X Screens (one per monitor) instead of 2 (where one screen had 2 monitors). Three screens also resulted in the max-power scenario.

It is still unclear to me why any combination of 2 out of the 3 monitors works, but all 3 together don’t. Since every pairing works, any timing that has to match between any 2 of them presumably also matches across all 3.

PS. I may have been wrong to say that it worked in the past. At the time, one of the DVI monitors may have been broken (its power supply had failed). In any case, the problem remains with all 3 for unknown reasons. It cannot be related to timings, unless you are saying that “timings only matter for more than 2 monitors but don’t matter for 2”.

PPS. Please keep in mind that manually forcing low power mode with the PowerMizer config lines posted earlier did force the video card into the low-power state. The problem then was that it would never leave that state. It functioned perfectly fine, just slowly, and hence was not very usable on demand for things like OpenCL or OpenGL.
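For reference, the workaround being described uses the driver’s RegistryDwords option in the Device section of xorg.conf. The exact values below are the commonly posted community recipe for pinning the lowest performance level, not something confirmed in this thread, so treat them as an assumption:

```
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    # Community-posted workaround (assumed values):
    # PerfLevelSrc=0x2222 pins the perf level source,
    # PowerMizerDefault(AC)=0x3 selects the lowest level.
    Option "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x2222; PowerMizerDefaultAC=0x3; PowerMizerDefault=0x3"
EndSection
```

As the post notes, this keeps the card cool and quiet but also prevents it from clocking up when OpenCL/OpenGL work actually needs it.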

The problem remains with 340.24.

So what is the issue here? Is the NVIDIA GTX 650 (Kepler) unable to actually drive 3+ displays efficiently?

Why don’t I see problems (aside from slow performance, as expected) if I manually force low power mode as described earlier?

This bug is costing people real $$$ wasted needlessly on power. It’s been an issue for more than a year now, yet every explanation is just hand-waving baloney. The two questions above contradict each other: if Kepler cannot handle 3+ displays efficiently, then why does it work just fine when forced into the low-power state via config parameters? And if it can, then why hasn’t this bug been fixed?

All I can say is, thankfully this is only a low-power 650 card, so it only burned an extra $20/yr needlessly instead of $100+.

Hey Franster,

I believe I have the same issue as you. I recently picked up a third monitor and I didn’t notice this right away, although it’s possible it’s been present since I added the third monitor.

I have a 780 Ti GPU. My monitors are 2x VK266h @ 1920x1200 connected via DVI, and 1x PG278q @ 2560x1440 connected via DP.

Whenever I add a third monitor, my PowerMizer Performance Level gets stuck at 2 (out of 3) as the absolute minimum. Even with nothing running, no compositing, on KDE, LXDE, and XFCE. GPU utilization is at 0%, with no processes reported in nvidia-smi. It doesn’t matter which monitor I add, DVI or DP: whenever there are 3, my minimum PowerMizer level changes to 2. So:

1x DVI, 1x DP - works
2x DVI - works
2x DVI, 1x DP - does not work

I don’t believe it has to do with refresh rates and metamodes matching, as sandipt stated above: my DP monitor can run at 2560x1440@140hz with my DVI monitors at 1920x1200@60hz, and I still have a normally functioning PowerMizer that stays at Performance Level 0 during normal usage. Even if I put all three monitors at 1920x1200@60hz (reported as 59.95hz for all three), PowerMizer still won’t go below Performance Level 2.

Is there something programmed in so that going over N connected monitors (2), or over some X Screen resolution, forces a higher performance level? If so, this should probably be re-evaluated, because at low or no utilization this GPU should be quite capable of handling 3 monitors; Franster showed this by forcing low performance mode (not an option for me) in xorg.conf. If not, then I’d say this is clearly a bug that needs to be logged with NVIDIA.

A GPU running 24/7 drawing an additional 100 W costs users roughly $150 a year in electricity to maintain. It also dumps roughly 350 BTU/hr into the room, which for a small room in a hot summer with low air exchange could raise the room temperature by up to 5°C. If you’re using air cooling, it’s also heating up your motherboard, CPU, etc., potentially slightly accelerating component degradation (albeit that’s probably trivial unless measured 24/7 over many years). If you’re using water cooling on a shared loop with a CPU, it’s directly heating the CPU as well (my case), by about 4°C at idle.
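The back-of-the-envelope numbers above can be checked with a quick sketch. The electricity rate is an assumption (about $0.17/kWh); the watts-to-BTU factor (1 W ≈ 3.412 BTU/hr) is standard:

```python
# Rough yearly cost and heat output of an extra 100 W of continuous GPU draw.
EXTRA_WATTS = 100
HOURS_PER_YEAR = 24 * 365        # 8760 hours
PRICE_PER_KWH = 0.17             # assumed electricity rate, USD/kWh

kwh_per_year = EXTRA_WATTS * HOURS_PER_YEAR / 1000   # energy in kWh
cost_per_year = kwh_per_year * PRICE_PER_KWH         # dollars per year
btu_per_hour = EXTRA_WATTS * 3.412                   # heat dumped into the room

print(f"{kwh_per_year:.0f} kWh/yr, ~${cost_per_year:.0f}/yr, ~{btu_per_hour:.0f} BTU/hr")
# → 876 kWh/yr, ~$149/yr, ~341 BTU/hr
```

So the “roughly $150/yr and 350 BTU/hr” figures in the post are consistent at that assumed rate.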

This “bug” affects a lot of different aspects and I hope will not be ignored. Can someone from NVIDIA comment on this and/or file an appropriate bug so this can finally be resolved?

Thank you

OMG, I just spent the last 4 hours trying to figure this out. I am having the exact same issue. Everything is great until I plug in the 3rd monitor, regardless of which port is used. I am at 0% load and maxed-out clocks. I think we all agree that these GPUs can easily handle 3 monitors without needing to waste energy and create so much noise and heat. PLEASE NVIDIA ANSWER US!!! I am taking an educated guess that this can be fixed with an update. Don’t leave us hanging. - GTX 650 (An awesome card <3 (other than this))

Submit a bug report:

Link to this thread and explain the problem.

I did a month ago and haven’t heard back, and there are no updates. But if everyone having this issue submits a bug, maybe it’ll eventually be fixed.

It’s actually probably cheaper for me to buy a second video card to run a third monitor… Ridiculous, NVIDIA. All you have to do is answer us to let us know you’ve confirmed the problem and are working on it! Give us some hope you still care :-)

Unfortunately this is STILL a problem when running 3 displays on a Titan X with 349 … but I’m glad I found this thread before beating my head into it for too long, as at least now I know it’s a known issue!

Just found this thread. I posted about what I think is the same issue here: