Hi,
I’m on Ubuntu GNOME 15.04, using the driver suggested by “Additional Drivers”. I have a 24" main screen and a secondary 10" screen.
Within NVIDIA X Server Settings, in the PowerMizer tab, three performance levels are shown (Graphics / Memory / Processor clocks):
0 -> 50 MHz / 270 MHz / 101 MHz
1 -> 405 MHz / 648 MHz / 810 MHz
2 -> 795 MHz / 4008 MHz / 1590 MHz
PowerMizer lets me choose between three options in the “Preferred Mode” dropdown:
Auto
Adaptive
Prefer Maximum Performance
Auto is selected, and “Current Mode” shows it is in Adaptive mode.
My problem is that whenever the secondary monitor is enabled, the clocks are always at performance level 2 on the desktop, even when the system is doing nothing.
If the secondary monitor is disabled (either in “NVIDIA X Server Settings” or in “Ubuntu → Settings → Screen Display”), everything works as intended and the clocks adapt to load.
In Windows, I solved this issue with a program called Nvidia Inspector, which has a “Multi Display Power Saver” tool that forces idle clocks with dual monitors.
So in conclusion, my question is:
is there anything I can do in Linux to get that feature from Nvidia Inspector?
This is unfortunately a known issue that probably won’t be fixed; I’ve asked Aaron Plattner about it, though it usually only presents itself when using three monitors rather than two. Essentially, the driver will set fixed clocks if the X screen is large enough that there’s a danger of the display underflowing during the time it would take to re-clock the GPU memory.
Two monitors is definitely getting to the point of being unreasonable, though, since that’s a huge increase in power consumption. I wish they’d look into a fix for this, though it’s possible this is on the list of things that would be fixed by Wayland (as opposed to the long list of things people would like to think can be fixed by Wayland but can’t).
Yeah, I guessed as much. To be honest, I hadn’t even thought about a fix. I was already counting on it not being fixed, since it has been a well-known Nvidia problem with all drivers on any OS for a very long time. In Windows it’s the same: full clocks any time a second monitor is plugged in and enabled.
I’ve read that sometimes the clocks scale fine if the monitors are the same resolution, but I can’t reproduce that here. If I apply a 1920x1080 resolution on my Lilliput 10", the clocks/temperature/consumption problem is still there: always performance level 2. I guess that since its native resolution is 1280x720, it doesn’t work. Maybe monitors of the same size are needed for that to work.
The big difference is that in Windows we at least have Nvidia Inspector, which lets us fix the clocks at performance level 0 and apply higher clocks only when certain processes are running or when the video/core loads are above a threshold.
I was wondering if there is a similar tool for Linux. Or some kind of script to achieve the same thing. Or whether it would even be possible to either ‘port’ that tool (chaos?) or build a similar one from scratch (gargantuan task?).
Considering the use I give this Ubuntu partition (only web development, i.e. text editors and browsing), I can hardly see myself ever needing anything above performance level 1. So I’d even be happy with a way to fix the clocks manually through some console command (script).
I’m not familiar with an Nvidia Inspector-like utility for Linux, so I know I’m no help there. I actually didn’t realize it was possible even on Windows.
From my understanding (and experience), for Fermi cards, if you have two displays with different resolutions or refresh rates, the card runs at the highest clocks. For Kepler and above I believe this limitation no longer applies, except for some of the three-monitor situations mentioned above.
I can say that in my experience with a GTX 570 SC, when I ran 1920x1080 and 1280x1024 together, the card ran at the highest clocks. Now I use two 1920x1080 displays and it clocks down to the lowest level.
FWIW, I get proper re-clocking when I have two displays (1680x1050 and 1280x1024) connected to my Titan X, but adding my third display (a 720p projector) always kicks it up to max clocks, which is a difference of something like 150 W. So needless to say I don’t leave the projector connected all the time as I’d like to; I hotplug it instead, which is kind of a pain, but not a dealbreaker for me as I’m almost never using it simultaneously with the others in practice.
I believe there is a way to force Powermizer states via the command line if you’re willing to do a bit of research on this forum and in the driver documentation … I haven’t wanted to go that route, though, because I do actually want the dynamic re-clocking sometimes, and for the Titan X there isn’t really a middle option (it has states 0, 1, 2, and 3, and 0 and 1 are almost equal, and 2 and 3 are almost equal, but there’s a massive gap between 1 and 2).
Hm. I’ll look for that way to force PowerMizer from the command line.
If it exists, and if there is also a way to retrieve the video core load and graphics core load, I can think about writing a very small daemon that checks these values in the background and sets the PowerMizer status accordingly.
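Something along these lines is what I have in mind. This is just a rough sketch, assuming nvidia-settings exposes GPUUtilization and GPUPowerMizerMode attributes on my driver (I still have to verify the attribute names, the value meanings and the exact output format), with an arbitrary threshold and polling interval:

#!/bin/sh
# Rough sketch of the daemon idea: poll the graphics load and switch the
# PowerMizer preference between Adaptive (0) and Prefer Maximum Performance (1).
# Attribute names, value meanings and the GPUUtilization output format are
# assumptions that still need to be checked against the real driver.
GPU="[gpu:0]"
while true; do
    # GPUUtilization is expected to print something like:
    #   graphics=12, memory=7, video=0, PCIe=1
    load=$(nvidia-settings -q "$GPU/GPUUtilization" -t | sed 's/.*graphics=\([0-9]*\).*/\1/')
    if [ "${load:-0}" -gt 60 ]; then
        nvidia-settings -a "$GPU/GPUPowerMizerMode=1" > /dev/null
    else
        nvidia-settings -a "$GPU/GPUPowerMizerMode=0" > /dev/null
    fi
    sleep 5
done

Of course this would only flip the PowerMizer preference rather than pin a specific performance level, but it would be a start.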
In the case of pravuz (the user who opened that other thread), the clocks didn’t scale; they just got stuck at the minimum performance level and he had to change performance levels manually.
In my case, the clocks scale. Right now it sits at performance level 0 on the desktop, and performance level 1 kicks in right after I open a big video in VLC (small videos still use level 0).
I haven’t tested anything to get to level 2 yet. Maybe I can try a SteamOS+Linux game; I should have one in my Library. But I’ll probably end up uninstalling it because, as I said, this is a work partition. I have Windows for gaming after all.
Interesting, so you’re saying that purely by specifying in Xorg.conf that you want a default of 0 but adaptive clocking enabled, you can get it to stay in state 0 but still re-clock, more effectively than if you didn’t use those configuration parameters at all? I’ll have to try that…
In my case, I used PerfLevelSrc=0x2222, which is strange because that supposedly means fixed clocks. Maybe it means fixed clocks within each performance level, not that PowerMizer is fixed to one performance level.
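For reference, this is roughly how that key goes into the Device section of xorg.conf, as far as I understand it (just a sketch; the Identifier/Driver lines are whatever your existing Device section already has, and only the RegistryDwords line matters here):

Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    # PerfLevelSrc=0x2222 as mentioned above; other PowerMizer registry keys
    # could presumably be appended to the same string, separated by semicolons.
    Option "RegistryDwords" "PerfLevelSrc=0x2222"
EndSection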
If I type this console command:
nvidia-settings -q GPUPerfModes -t
I get a base MHz, a min MHz and a max MHz for each performance level. Maybe that wiki is not 100% accurate on that point and 0x2222 means that each performance level only uses its base MHz. But that may just be me overthinking it.
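One way to check that guess, assuming the attribute is available on this driver, would be to compare the clocks currently in use against those base values while something is running:

nvidia-settings -q "[gpu:0]/GPUCurrentClockFreqs" -t

If the graphics/memory clocks reported there never move away from the base MHz of the active level, that would fit the “fixed clocks per level” reading.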
Anyways.
I tried a Steam game and it stays stuck at performance level 0.
But some videos in VLC (big ones) trigger performance level 1.
Strange.
I will try a GPU stress test to see if that makes performance level 2 kick in and whether it was a problem with that particular game (Trine). But as I said, if it doesn’t, I won’t mind. This Ubuntu GNOME partition is for programming only.
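While the stress test runs I’ll just keep an eye on the active level from a second terminal, with something like this (again assuming the attribute name is right):

watch -n 1 'nvidia-settings -q "[gpu:0]/GPUCurrentPerfLevel" -t'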