Problem with P-states in Linux OSes

I own an MSI GeForce GTX 1080 Gaming X 8G and a Samsung U28R55 28" monitor (Samsung U28R55 - Specifications).
I have a question: why, under Linux operating systems, does the video card so often switch to the P0 state? It happens at completely different moments: moving terminal windows, opening the application menu, watching YouTube video (in different browsers). In Windows I can watch 4K video while the card stays in the P8 state, so this is not a hardware problem.
In Windows 10 I do not observe this behavior at all. The difference in power consumption is huge: P8 draws 14-20 W, while P0 draws at least 48 W.
The temperature in the P0 state reaches 61 °C, but in the P8 state under Windows it is about 37-40 °C. The lowest temperature I have ever seen in Linux is 46 °C. That is not normal.
I tested and measured on official proprietary NVIDIA drivers from 385 to 470.
Linux operating systems tested: Fedora 34, openSUSE Tumbleweed, Ubuntu 16.04/18.04/20.04/21.04/21.10, MX Linux 19.4.1, antiX 19.4, Linux Mint 20.2, Debian 11 Testing, Devuan Chimaera 4.0 Alpha, Manjaro 21.0.7, Artix, ArchLabs, RebornOS, with different kernels from 4.9 up to 5.13, and with different DEs (XFCE, GNOME 3, Cinnamon, MATE, KDE, JWM, Fluxbox, IceWM).
This generation of video cards has existed for many years, yet the problem has still not been solved. Why? What should I do?

As I understand it, there is no answer and no solution from NVIDIA?
Since 2019 there has been nothing new in this thread: If you have GPU clock boost problems, please try __GL_ExperimentalPerfStrategy=1 - #50 by walmartshopper
So do I have to sell my card and buy something from AMD in order to switch to Linux completely and comfortably?
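For reference, the workaround discussed in that thread is a per-application environment variable. A minimal sketch of how it is typically applied (glxgears is only a stand-in for whatever application you actually run):

```shell
# Launch a single OpenGL application with the experimental
# performance strategy enabled.
__GL_ExperimentalPerfStrategy=1 glxgears

# In another terminal, watch the performance state and power draw
# once per second (P0 = maximum performance, P8 = near-idle).
nvidia-smi --query-gpu=pstate,power.draw --format=csv -l 1
```

Whether this actually keeps the card out of P0 varies by driver version, which is exactly what the linked thread is about.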

So do I have to sell my card and buy something from AMD in order to switch to Linux completely and comfortably?

You might discover that AMD cards don't work much better under Linux either, because Windows has a completely different graphics stack, and GPU drivers under Windows cooperate far more closely with your environment and applications than they do under Linux.

But it is also impossible to continue like this. I need a system other than Windows for work. The user should NOT have to solve developer problems, and this is a problem for NVIDIA's developers. I have tried many environments and many systems, and I have spent a huge amount of time. I do not understand why the video card should consume 50 W in 2D mode (I repeat, I tried even the simplest environments, such as IceWM, which would run even on a Pentium 1). This is a driver problem; apart from the developers, no one can help, and as I see it now, no one wants to. The main thing is to sell the product, no matter how it works.
About AMD: I have an HP Envy x360 laptop (15M-EE0013DX). It is entirely AMD-based, and it has no problems with power consumption or performance, even though Linux support is not officially stated.
But I love working on a desktop, and I will have to decide something if no solution is found.

You’re talking about “2D mode”, but web browsers nowadays use many 3D features to compose and output the final image on the screen. My 1660 Ti barely spends any time in P0; it’s occasionally in P5 and mostly in P8 even when I’m using a web browser. And even in P0 I’m looking at just 25 W of power being dissipated.

I’m using Fedora 33 and XFCE 4 without compositing, Firefox and Google Chrome (with GPU rasterization turned on).
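For anyone trying to reproduce this setup, GPU rasterization in Chrome can also be forced from the command line rather than via chrome://flags (flag name as of recent Chrome versions; verify the result afterwards, since it is driver-dependent):

```shell
# Launch Chrome with GPU rasterization forced on, then check the
# "Graphics Feature Status" section of chrome://gpu to confirm.
google-chrome --enable-gpu-rasterization
```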

In the end I have to agree that the Linux NVIDIA drivers work a lot worse than the Windows ones in terms of power consumption. It has been like that for a decade now, and older GPUs didn’t have good power management at all.

Is there anything new on my question?
I have since tried several more distributions, and the problem is still not solved.
Moving a window in the DE pushes power usage up to 46-50 W for about 30-43 seconds.
I even installed niche systems such as Void Linux with the proprietary NVIDIA driver 510.
Does NVIDIA not care about its customers?
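To put hard numbers on observations like the window-move spike, the power draw can be logged with nvidia-smi's CSV query mode. A minimal sketch (the one-second interval and the field list are just one reasonable choice):

```shell
#!/bin/sh
# Log timestamp, P-state, power draw and temperature once per second,
# and flag every sample where the card left the low-power P8 state.
nvidia-smi --query-gpu=timestamp,pstate,power.draw,temperature.gpu \
           --format=csv,noheader -l 1 |
awk -F', ' '$2 != "P8" { print "high-power sample:", $0 }'
```

Running this in a terminal while moving windows makes it easy to see how long the card lingers in P0 before it drops back down.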

There are many threads with these questions. I asked the same question a year ago and compared Windows and Linux doing simple things like watching a video in HD, and the energy consumption was much higher in Linux.

On Windows my GPU would rarely hit 3D mode, while on Linux it continuously hit 3D mode and took longer to return to idle.

My thread was never answered, except by birdie, as I seem to remember.
NVIDIA doesn’t care much about Linux. You just have to notice that we don’t even have a Linux forum at NVIDIA; this is a developers’ forum.


NVIDIA doesn’t see much of a use case in the ~1% of users occasionally dabbling with games, and their professional customers who do animation/rendering/CAD under Linux couldn’t care less about power consumption. I’m not defending NVIDIA, I’m just explaining their attitude, which many Linux users find repulsive. I’m not a huge fan of it either, but at least they support Linux, which is still better than no support at all.

And if you think other companies treat Linux better, think again. Intel will only add support for Intel Thread Director in Linux 5.18, which will be released in approximately 5 months, i.e. almost a full year after Windows 11, which supported the feature from the get-go.

If you deal with hundreds or even thousands of Linux systems with NVIDIA GPUs and threaten the company with the cancellation of a contract, they might well address the issue swiftly. Otherwise, we really stand no chance. Just look at these forums: quite low activity overall; most people come, create a post, and are never seen again. Compare that to the GeForce forums, where there are hundreds of posts daily and each regression garners dozens of confirmations.

I told NVIDIA that idle power consumption had increased recently (not by a lot, from 8.5 W to 11 W for my 1660 Ti). What have they done? Nothing; they didn’t even confirm the issue. How many people replied to my thread? Zero.

If Chromebooks start using NVIDIA GPUs (which sounds extremely unlikely), there’s a chance NVIDIA will treat their Linux drivers better. I see no other outcome; Linux on the desktop is too much of a niche OS.

Maybe write in this topic as well, though NVIDIA considers it “solved”.


I’m using the MXM P1000 as an embedded device, and I need to limit its power consumption using the Ubuntu 20.04 command line only. It jumps up to 45 W and raises the internal temperature dangerously high. I need to cap both the power limit and the temperature cut-off threshold. Any suggestions would be great.
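Assuming the board and driver expose software power limits, nvidia-smi can cap the draw directly. A sketch of the usual sequence (the 30 W and 75 °C values are only examples; check the supported range for your board first, and note that the target-temperature option requires a reasonably recent driver):

```shell
# Show the supported power-limit range for this board.
nvidia-smi -q -d POWER

# Enable persistence mode so settings survive between clients.
sudo nvidia-smi -pm 1

# Cap the board power limit to 30 W (example value; it must lie
# within the min/max enforced limits reported above).
sudo nvidia-smi -pl 30

# Set the GPU target temperature to 75 C (example value; older
# nvidia-smi versions may not support -gtt).
sudo nvidia-smi -gtt 75
```

Power limits set this way do not persist across reboots, so on an embedded device they are usually wrapped in a systemd unit or startup script.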