Nvidia on Linux drawing more power than Windows


I have been monitoring the power draw of my NVIDIA GeForce GTX 1660 card on a dual-boot Windows/Linux system. I always felt that Linux ran hotter than Windows, so I ran a more detailed test.

I tried to create identical test scenarios on both operating systems:

  1. Power off the PC for 2 hours to let it cool down.
  2. Turn on the PC and leave it idle for 15 minutes, without running any applications.
  3. Run nvidia-smi --query-gpu=timestamp,temperature.gpu,power.draw --format=csv,noheader > test-data.csv
  4. Let it run for 60 minutes.
  5. Compute the average temperature and power draw.
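
As written, step 3 invokes nvidia-smi once and exits; to log for the full 60 minutes it needs the documented -l (loop) flag. A minimal sketch of steps 3–5, assuming the CSV layout produced by the query above (power.draw prints values like "17.53 W", which awk coerces to a number):

```shell
# Sample every second for 60 minutes (skipped when nvidia-smi is unavailable).
if command -v nvidia-smi >/dev/null; then
  timeout 3600 nvidia-smi --query-gpu=timestamp,temperature.gpu,power.draw \
      --format=csv,noheader -l 1 > test-data.csv
fi

# Average temperature (field 2) and power draw (field 3) over the log.
if [ -s test-data.csv ]; then
  awk -F', ' '{ t += $2; p += $3; n++ }
              END { printf "avg temp: %.1f C, avg power: %.2f W\n", t/n, p/n }' test-data.csv
fi
```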

The result was that Linux ran a degree hotter: 50 °C vs. 49 °C on Windows.

What was more surprising was that Windows drew about 15 W of power vs. 17 W on Linux.

I also ran the above test during a non-idle task (watching a YouTube video), and Linux again consumed more power than Windows. I might attribute that to different rendering paths (Xorg vs. Win32), but I don’t understand why the idle power consumption would be so high, especially since on Linux I have fewer processes running than on Windows.

Are there any settings that I can tweak to reduce the power consumption?

I opened a thread in 2020 with a lot of tests showing that the Linux driver consumes much more power than the Windows one.

Nobody from NVIDIA replied to that thread, and the replies from other users were that it is a long-standing problem.

Thank you @deivi83 for your response. Seems like nothing is being done about this. As you noted as well, during higher-intensity tasks the temperature and power consumption are significantly higher on Linux, which is annoying.

Have you got a high-refresh-rate monitor? If you do, it’s a known bug, but I don’t think NVIDIA will ever solve it.

The second issue is that the Windows drivers switch between power modes a lot faster, which means that on Linux, whenever there is a spike in GPU activity, the card remains in a high-performance mode longer, thus consuming more energy.
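
You can watch that behaviour yourself: pstate is a documented nvidia-smi query field (P0 is maximum performance; P8 is the usual desktop-idle state). A small sketch, sampling once per second:

```shell
# Log performance state next to power draw; after a burst of GPU activity,
# note how long the card lingers in P0/P2 before dropping back to P8.
if command -v nvidia-smi >/dev/null; then
  nvidia-smi --query-gpu=timestamp,pstate,power.draw --format=csv,noheader -l 1 \
    | head -n 60
fi
```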

My only hope is that the open-source NVIDIA drivers will work better; at least you have access to the source code, so you’ll be able to tune them for your needs, unlike the blob, which has no options in this regard.

I first talked about it in 2017 :-)

Actually, during higher-intensity tasks there should be no discernible difference, as the only thing that matters in that case is the configured TDP, which must be the same unless you’ve overclocked your GPU under Linux.

It’s only when you have alternating activity, i.e. periods of average/high consumption followed by idleness, that the Windows drivers work a lot better.
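
The "same configured TDP" point can be checked directly: power.limit, power.default_limit, and power.max_limit are documented nvidia-smi query fields, and -pl is the one power knob the proprietary driver does expose (it requires root and a board that supports software power capping):

```shell
# The enforced limit should equal the board default on both OSes unless changed.
if command -v nvidia-smi >/dev/null; then
  nvidia-smi --query-gpu=power.limit,power.default_limit,power.max_limit --format=csv
  # To cap the card, e.g. at 100 W (must stay within the min/max limits):
  # sudo nvidia-smi -pl 100
fi
```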