GPUCurrentCoreVoltage/GPUOverVoltageOffset missing on Pascal and Turing cards


I have recently switched to Arch Linux as my main OS (I was using Windows 10 before) and am enjoying it quite a bit.

After figuring out workarounds and finding replacements for certain things, the only remaining issue for me is the inability to modify the GPU core voltage. Under Windows, this helped me immensely in ironing out some instabilities when overclocking, even with the relatively small range (IIRC, up to +100 mV) MSI Afterburner gives you.

Under Linux, setting the “Coolbits” option in the Xorg configuration should also unlock the ability to adjust voltage via the nvidia-settings CLI tool.

The relevant attribute should then be either GPUCurrentCoreVoltage or GPUOverVoltageOffset (I’m thinking the former is read-only and the latter is for adjusting the voltage?).
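For reference, this is roughly how it would be expected to work with Coolbits enabled. This is a sketch, not something that currently works on Pascal/Turing: the GPU index is a placeholder, and the Coolbits bit values (8 for clock offsets, 16 for overvoltage) and the microvolt unit for the offset are my recollection from older driver READMEs, so double-check them against your driver version.

```shell
# Enable clock-offset (bit 8) and overvoltage (bit 16) controls,
# i.e. Coolbits = 8 + 16 = 24; writes xorg.conf, needs an X restart.
sudo nvidia-xconfig --cool-bits=24

# Read the current core voltage (read-only attribute):
nvidia-settings --query [gpu:0]/GPUCurrentCoreVoltage

# Apply a core-voltage offset, reportedly in microvolts
# (e.g. 50000 µV = +50 mV):
nvidia-settings --assign [gpu:0]/GPUOverVoltageOffset=50000
```

On pre-Pascal cards this is what the forum posts below suggest used to work; on my 1080 Ti the two attributes simply do not exist, so the query and assign commands fail.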

However, this doesn’t seem to work on either Pascal or Turing cards. I have tested this myself with my 1080 Ti using the latest driver (440.36) as well as various previous driver versions. Running something like nvidia-settings --query all | grep -i volt yields no results, and even when manually looking through the list of attributes, it doesn’t seem to contain anything related to reading or adjusting voltage.

Searching the web revealed a few mentions of this issue.

However, I couldn’t find any statement or information about why adjusting core voltage would no longer be possible (or got disabled) on Pascal and Turing cards.

Since it is possible under Windows and apparently also works under Linux when using pre-Pascal cards, this seems like a rather arbitrary limitation to me.

Especially with Proton and Linux gaming in general getting better and more popular, I would love to see this option being available for more recent cards. Enthusiasts (like myself) would surely appreciate the ability to reach the same overclocks on Linux as on Windows.

Edit: The thing I’m interested in is overvolting in particular. It looks like undervolting has never been possible on Linux with the official drivers.

Voltage tuning has never been possible/available under Linux with the official NVIDIA drivers. I also doubt it will ever be enabled.

Thank you for your input. Why do these two attributes exist at all then? Also, why would they have been removed with Pascal/Turing? I can see several statements of people having these attributes on older cards, but they are apparently gone altogether with Pascal/Turing.

There are also several sources claiming that it should be possible to adjust the voltage (see the last example).

So I guess that’s either misinformation or only applies to non-official drivers (although stated nowhere)?

I’m terribly sorry, and you seem to have been right all along: GPUOverVoltageOffset was available for pre-Pascal cards, but I never used it, perhaps because it could only be used for overclocking.

The options are still there, though strangely only in the nvidia-settings binary:

[root@localhost NVIDIA-Linux-x86_64-440.31]# grep -r GPUCurrentCoreVoltage .
Binary file ./nvidia-settings matches
[root@localhost NVIDIA-Linux-x86_64-440.31]# grep -r GPUOverVoltageOffset .
Binary file ./nvidia-settings matches

There are no other places where they are present, so it looks like they are essentially deprecated.

Undervolting was never possible. You may be confusing it with something else.

No need to be sorry, I probably should’ve stated overvolting more specifically.

Interesting. I would love to know why they were deprecated though and if there is any chance to get these options back. Sorry, should I maybe ask these questions elsewhere? This forum seemed like the right place, but it looks like only someone actually involved in the development of the Linux drivers could have an answer (and I don’t really know if they frequent here).

NVIDIA Linux driver developers/engineers sometimes leave messages in this forum, but it’s hit or miss for all of us. It happens quite intermittently, maybe every 2-3 weeks, and the topics they choose to reply to are quite random and always from the first page. So, if you expect to get a reply, make sure this topic never leaves the first page.

From observation over the years, anything CoolBits-related never gets an official answer from NVIDIA devs. It’s there, it sometimes works, but if it doesn’t, you’re out of luck.