Very(!) slow ramp down from high to low clock speeds, leading to significantly increased power consumption

I will bump this thread indefinitely.

This is a blocker, for God’s sake.

Obviously driver version 384.90 is affected as well.

I edited the original post to reflect this further regression.

NVIDIA really could add an “Optimal” PowerMizer setting like the one in the Windows drivers, for maximum power savings during mostly-2D desktop operation, but I doubt it’s realizable under Linux without the right APIs, given the lack of feedback from the desktop/GUI/X server, etc.
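For reference, the only PowerMizer knob I’m aware of in the Linux driver is GPUPowerMizerMode (0 = Adaptive, 1 = Prefer Maximum Performance); there is no power-savings-first choice. Querying and setting it from a terminal looks like this:

nvidia-settings -q "[gpu:0]/GPUPowerMizerMode"
nvidia-settings -a "[gpu:0]/GPUPowerMizerMode=0"

(0 = Adaptive; neither value gives the Windows-style “Optimal” behavior this thread is asking for.)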

I don’t think it’s related to the desktop: in Blender under Windows my clocks come down in less than a second. Under Linux, well, you know.
Today I noticed a 30-second cooldown under Windows too: the clocks stayed high for 30 seconds right after moving a cube in Blender. I’m not sure what caused it (maybe a background driver update?), and it was gone after switching the power mode to Adaptive and back to Optimal and restarting the PC. I’m not sure how long I had been working like that.
This is so annoying. I’m on a desktop, but I can’t imagine having a laptop with this bug. It’s one of those things that stops me from switching to Linux completely, and I have to say I’ve done a lot of work to adapt my 3D workflow, and the new GNOME Linux desktop is so cool.
Fortunately, the good news is that the NVIDIA driver is just as fast in Blender under Linux.

EDIT: I just checked, and I’ve got the same driver (dated 18.07.2017) under Windows.

Bump ;-)

I want the time-to-idle to be configurable via a module parameter. Please!

Let it be 36 seconds by default, fine (it’s not OK by any measure, but someone at NVIDIA “thinks different”™).

I wonder who at NVIDIA I should write to in order to get this problem fixed.

This is still relevant.

We will get old and this won’t be fixed…

*bump

I’ve heard rumors that NVIDIA has finally started investigating this issue. I’m not holding my breath, though, considering how long it’s been known and how long Aaron Plattner and other NVIDIA folks have denied it.

I hope they do.

Hello.
I’m joining the club.
I’ve got a laptop whose monitor outputs are connected to the NVIDIA 1060 card, which forces me to keep the card powered.
25-30 W with just a terminal open is brutal for both acoustics and battery life.

My educated guess is that this happens because Linux desktop environments use OpenGL for rendering (unlike Windows, which uses an accelerated 2D engine), so the driver behaves as it would for a game.
Since OpenGL is a single API, we can have either slow games or hot desktops.
NVIDIA obviously chooses performance; I would prefer power saving.
The solution is really simple: add a “Max Power Savings” option to PowerMizer in nvidia-settings that locks the GPU in a (configurable) low-power state (P8 by default).
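In the meantime, you can at least watch which performance state the card actually sits in; plain nvidia-smi polling once per second does the job:

nvidia-smi --query-gpu=pstate,clocks.gr,power.draw --format=csv -l 1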

I finally found the OverrideMaxPerf option in the driver .o, close to PowerMizerEnable, and gave it a try.
I had to add it to /etc/modprobe.d/nvidia.conf, since my X server doesn’t start if a Device section is present.

options nvidia NVreg_RegistryDwords="OverrideMaxPerf=0x1"

It works great and runs cool, with 3 to 8 W used.
The only downside is that if I want to play something, I have to comment out the line, stop the display manager, remove the NVIDIA kernel modules, reinsert the kernel modules, and finally restart the display manager (see the sketch below). Needless to say, rebooting into Windows is much quicker.
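Roughly, the dance looks like this (just a sketch, assuming systemd and GDM; adjust the display-manager unit to your setup, and skip any module that isn’t loaded):

sudo sed -i 's/^options nvidia/#options nvidia/' /etc/modprobe.d/nvidia.conf
sudo systemctl stop gdm
sudo modprobe -r nvidia_drm nvidia_modeset nvidia_uvm nvidia
sudo modprobe nvidia_drm
sudo systemctl start gdm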

See my post #34. I ran this test in Blender under Windows (OpenGL) again, and system power consumption drops from 50 W to 27 W in less than a second after I stop moving objects in the viewport. Besides, if you look at the nvidia-smi process list you get:

|    0      5148    C+G   C:\Windows\explorer.exe                    N/A      |
|    0      6536    C+G   ...t_cw5n1h2txyewy\ShellExperienceHost.exe N/A      |
|    0      6804    C+G   ...dows.Cortana_cw5n1h2txyewy\SearchUI.exe N/A      |

Windows has used 2D acceleration since before the Luna days, IMO.

Sorry for the delay. Can you please test the 410.57 driver? How long does it take to reduce clocks/power with this driver?

36 (!) seconds in previous drivers.

I haven’t tested this one yet.

I’m waiting for 410 to be available in the PPA; I’ll test ASAP and let you know.

I’m joining the club because I have the same problem… Unfortunately, with the new driver (410.57) it seems to be worse. I tested a few times, and every time it took ~45 seconds for the driver to reduce the clocks. With the previous drivers (396.54.05 and 396.54) it was ~36 seconds. I have a GTX 1070 and I’m on Debian 9 + Xfce, if that matters.
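If anyone wants to reproduce the measurement: one low-tech way is to poll once per second, starting right when you stop the 3D load, and count the rows until the clocks drop:

nvidia-smi --query-gpu=timestamp,pstate,clocks.gr,power.draw --format=csv -l 1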

45 seconds? Looooooooool.

Why doesn’t NVIDIA just disable the low-power mode and/or adaptive power management on Linux?

How hard can it be to reduce that time?
Maybe it’s intended to work this way? Or maybe it’s not just a timer, and the devs can’t figure it out?

Man, this thread is more than one year old!

…this is getting ridiculous, really.