Unreasonably high power draw on Linux compared to Windows

I noticed a huge difference in the power draw of my GeForce GTX 1060 (6GB) when running Linux compared to Windows.

Rough overview of the hardware and software involved:

  • Intel i7-3770
  • Nvidia 1060 (6GB)
  • Windows 10 (version 1909) and driver 441.20
  • Debian Testing, gnome-shell 3.34.1+git20191024-1, xorg 1:7.7+20, Kernel 5.3.9-3 and driver 440.36
  • Everything running in 1080p@60Hz (locked at 60FPS when possible)

As far as the Linux side of this is concerned:
Distribution, kernel or xorg version, desktop environment, and features like compositing make no difference. I have also tested this on PCs with slightly newer and/or older hardware, and the behaviour is basically the same.

When running Windows, I can pretty much do whatever I want (gaming, CAD, Photoshop, etc.) and never get even close to 60°C GPU temperature as reported by nvidia-smi, with the fans staying at 0 RPM. When doing simple tasks like watching YouTube/Netflix, using Firefox, etc., the card spends almost all of its time in P8 (~11W, ~15W max.) and stays well below the 53°C mark.

When running Linux, launching even very simple applications like a terminal or gedit causes the GPU to hit P0 (~30W at least). Switching between windows or virtual desktops keeps the GPU at P0, resulting in temps well over 60°C with the fans actually spinning up. Scrolling through text, moving windows, or running other 2D-only applications will likewise put the GPU in P0 (again, ~30W minimum). Doing actual work, running browsers or email clients, etc. keeps the GPU in P0 pretty much all of the time, which is insane for not really doing anything.
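For anyone who wants to reproduce this, the P-state transitions are easy to watch live. A minimal monitoring one-liner using standard nvidia-smi query fields, polled once per second:

    nvidia-smi --query-gpu=pstate,power.draw,clocks.gr,temperature.gpu --format=csv -l 1

With that running, simply opening a terminal window is enough to see the card jump from P8 to P0.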

Since the 440.x series driver improved on this a little bit, I would be willing to invest quite some time and effort into testing and providing more detailed information if anyone at NVIDIA is interested in getting this fixed somehow. Running a text editor on Linux should not put more load on a GPU than the majority of games running on Windows…

Same problem here.
I have read many topics about this problem, none with a solution and never with an official answer.

Nobody cares about this problem.

It’s a long-standing issue:
https://devtalk.nvidia.com/default/topic/1002912/linux/very-slow-ramp-down-from-high-to-low-clock-speeds-leading-to-a-significantly-increased-power-cons-/
The nvidia devs are aware of it and working on it, but haven't really gotten it done so far:
https://devtalk.nvidia.com/default/topic/1048768/linux/if-you-have-gpu-clock-boost-problems-please-try-__gl_experimentalperfstrategy-1/
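As the title of that second thread says, the experimental strategy is enabled per application through an environment variable, e.g. (glxgears here is just a stand-in for any GL application):

    __GL_ExperimentalPerfStrategy=1 glxgears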

If you don’t run heavy games on Linux, you can use the workaround posted in the first topic generix mentioned.

It will keep the GPU at the lowest power level at all times.
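For reference, the variant usually posted in those threads pins PowerMizer to a fixed, lowest performance level through the Device section of xorg.conf. This is only a sketch; the RegistryDwords values below are the ones circulating in the linked topic, so verify them there before using this:

    Section "Device"
        Identifier "nvidia"
        Driver     "nvidia"
        # 0x2222 = fixed performance level source, 0x3 = lowest level
        Option     "RegistryDwords" "PowerMizerEnable=0x1; PerfLevelSrc=0x2222; PowerMizerDefault=0x3; PowerMizerDefaultAC=0x3"
    EndSection

The flip side is exactly what was said above: the card stays capped at that level even under load, so it only suits machines that never run anything heavy.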

I’m not sure why NVIDIA doesn’t want to solve it - most likely non-professional desktop Linux users are not their priority.

As for professionals - they couldn’t care less about their GPUs’ power consumption.

I don’t think that’s the full story. Using render offload, the driver works as expected: it boosts to max on application start and, about 2 seconds later, adapts to the real GPU usage and throttles down. On application exit, it again throttles down to the minimum within 2 seconds.
So I have the impression that it’s not about throttling down, but about something in the driver constantly re-triggering boost when driving a real X screen.
Now the question is whether the devs don’t see the elephant in the room, or whether they made a wrong architectural choice early on and can’t change it without breaking things.
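For context, “render offload” here means PRIME render offload, i.e. explicitly sending a single application to the NVIDIA GPU while another device drives the X screen. The two environment variables below are the documented NVIDIA ones, and glxinfo is just a convenient test program:

    __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL vendor"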

I’m aware of this workaround, but I will not accept this as a paying customer!

This is also complete BS in my opinion. There’s absolutely NO reason why the GPU would need to boost to P0 for any period of time to launch an application, scroll through text, or simply switch focus from one application to another. The Windows driver does not do this, and even when the GPU is forced to stay in P8 all the time, the power draw on Linux is still higher (!!!) than on Windows, and everything works fine.

I would appreciate a statement from NVIDIA on this issue, because this is unacceptable. I’m not willing to pay a premium for their hardware if I get sub-standard driver support out of it.

Does your GPU work under Linux? It surely does. Does it meet your strict power consumption requirements under Linux? Not really, but… did NVIDIA promise you anything in this regard? No. Their primary platform has always been Windows. Be grateful we have working, fast, relatively bug-free drivers for Linux, because NVIDIA could just as well not release anything at all, and then you wouldn’t even have the nouveau drivers: it would have been impossible to create them if the NVIDIA drivers hadn’t existed.

Also, this is just a support forum. Your resentment won’t affect NVIDIA in any way, shape or form. If you were a major company with literally thousands of NVIDIA GPUs, you could ask them to resolve this issue ASAP, but you don’t strike me as a manager of such a company, and as an end user your options are to take it or leave it.

The topic where this issue is discussed was created over two years ago. NVIDIA are perfectly aware of the issue. The fact that the bug is still here means either that it’s not trivial to solve, or that solving it requires more manpower than NVIDIA can afford or chooses to spend.

Quite frankly, no. I’m not grateful for sub-standard driver support for a product I spent money on and use in a configuration officially supported by the vendor.

I’m speaking of applications that create their own GL/Vulkan context. In that case, there are lots of good reasons to boost initially, since there’s often a lot of heavy lifting to do on init. So you couldn’t be farther from reality with that opinion.

Would you care to elaborate on the meaning of “sub-standard driver support”? 'Cause I’m surely struggling to see anything substandard. There are quirks, for sure, but they don’t in any way, shape or form stop you from fully utilizing your NVIDIA GPU under Linux, albeit with a slightly increased power consumption.

Just to put it straight, there’s for sure no reason to be “grateful” in any way. Nvidia makes a good share of its quarterly income from Linux, and the least volatile share at that: Gaming (~100% Windows, miners are gone) is $0.9b - $1.7b, while Datacenter + Automotive (~100% Linux) is $0.8b.