Driver reporting wrong refresh rate when using TwinView


I’m running into a bug that was already reported 9 years ago here:

Unfortunately, the first workaround mentioned there no longer works, because the X option it relied on was removed in the 331.xx drivers.
As a result, GNOME Shell runs at 50Hz, and some VSync-dependent games are locked to 50fps instead of 144. (I have three monitors: one is 144Hz and the other two are 60Hz.) The monitors themselves run at the rates I set them to, and XFCE (XFWM) works fine since it doesn’t rely on NVIDIA’s refresh rate detection, unlike Compiz/Mutter.
When I unplug the two other monitors (which disables TwinView), everything runs at 144Hz, as expected.

What to do about that?
nvidia-bug-report.log.gz (317 KB)

Two weeks later – no answer.
What’s going on, NVIDIA?! This is a major issue that has been reported for almost 10 years. Will it stay unfixed for 10 more?!

Wow, that didn’t come to my attention for some reason, I apologize for that.

I attached the generated file to the original post.

EDIT: Not sure if it worked, here’s a link to it:


It looks like the driver is reporting the correct refresh rate information via RandR 1.2, although I’m only seeing two monitors. Abridged:

DVI-I-1 connected 1600x900+2560+0 (0x29b) normal (normal left inverted right x axis y axis) 443mm x 249mm
  1600x900 (0x29b) 118.250MHz -HSync +VSync *current +preferred
        h: width  1600 start 1688 end 1856 total 2112 skew    0 clock  55.99KHz
        v: height  900 start  903 end  908 total  934           clock  59.95Hz
DP-0 connected primary 2560x1440+0+0 (0x1c3) normal (normal left inverted right x axis y axis) 598mm x 336mm
  2560x1440 (0x1c3) 538.760MHz +HSync +VSync *current
        h: width  2560 start 2564 end 2580 total 2582 skew    0 clock 208.66KHz
        v: height 1440 start 1441 end 1442 total 1449           clock 144.00Hz

If gnome-shell is getting a refresh rate of 50, then it must be reading it via RandR 1.0. What version of gnome-shell are you using? I’d be surprised if modern versions were using RandR 1.0 queries rather than RandR 1.2.
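For reference, the per-mode refresh rate RandR 1.2 reports is derived from the pixel clock and the horizontal/vertical totals of the modeline. A minimal sketch of that arithmetic, using the values from the xrandr output above (pure calculation, no X calls):

```python
def mode_refresh_hz(dot_clock_hz, htotal, vtotal):
    """Refresh rate as derived from a modeline: pixel clock / (htotal * vtotal)."""
    return dot_clock_hz / (htotal * vtotal)

# Values taken from the abridged xrandr output above
print(round(mode_refresh_hz(118.25e6, 2112, 934), 2))   # DVI-I-1 -> 59.95
print(round(mode_refresh_hz(538.76e6, 2582, 1449), 2))  # DP-0    -> 144.0
```

Both results match the per-output clocks shown above, which is why the RandR 1.2 data looks correct.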


Yes, only two monitors were plugged in at the time, for testing purposes. But the behavior is the same with 2, 3, or 4. The issue doesn’t happen with only one.

I’m using GNOME Shell 3.20.4, on Arch Linux. Not sure if that helps, but here’s the output of xrandr --version:

xrandr program version       1.5.0
Server reports RandR version 1.5

I tried upgrading to GNOME Shell 3.22, which is currently in Arch’s testing repository, but it didn’t go well (I couldn’t get it to work), so I’ve rolled back for now.
I’ll investigate how GNOME (or rather Mutter, its compositor) determines the refresh rate. Thanks.

I spent some time searching, and it seems Mutter uses RandR 1.5 requests when possible.

In the meantime, couldn’t RandR 1.0 requests just return the highest refresh rate of all displays? Reporting the highest rate would make more sense than reporting 50.
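To illustrate what such a fallback could look like, here is a rough sketch that picks the highest current refresh rate across outputs. It parses a hardcoded sample of xrandr output (the sample data is hypothetical, and real driver code would read the mode table directly rather than parse text):

```python
import re

# Sample (abridged) `xrandr` output; '*' marks each output's current mode
XRANDR_OUT = """\
DVI-I-1 connected 1600x900+2560+0
   1600x900      59.95*+
DP-0 connected primary 2560x1440+0+0
   2560x1440    144.00*
"""

# Collect the current refresh rate of every active output
rates = [float(m) for m in re.findall(r"([0-9.]+)\*", XRANDR_OUT)]

# The rate a RandR 1.0 fallback could report instead of an arbitrary one
print(max(rates))  # -> 144.0
```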