On Optimus systems, Gtk’s new DPI auto detection seems to screw up. Not only that, but the nvidia driver Xorg DPI setting seems to behave differently from the xrandr way.
Setting the DPI through nvidia’s xorg config way results in the correct number in xdpyinfo. Blender believes it’s rendering at 72 DPI when it’s actually using the system’s DPI value, but that’s not too tragic.
Now take your average Optimus wrangling startup script:
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto
For me, this is /usr/share/sddm/scripts/Xsetup. After recently having kwin die and the restarted kwin session visibly use a different DPI, in addition to the Gtk applications messing up auto detection, I got curious.
So I removed this directive from my Xorg.conf:
# Inside the nvidia Device section
Option "DPI" "144 x 144"
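For reference, that option sits in a Device section roughly like this sketch (the identifier is a placeholder and I'm omitting the other options from my config; only the Option line is verbatim):

```
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    # The directive I removed:
    Option     "DPI" "144 x 144"
EndSection
```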
And instead, settled for this:
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto
xrandr --dpi 144x144
The sddm login screen is scaled wrong (way too small, it probably thinks it’s at 72 DPI), but the rest of the desktop is fine. Newly launched applications are also fine. Gtk DPI auto detection still fails miserably.
So I thought, maybe I should move it before the Optimus wrangling.
xrandr --dpi 144x144
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto
Result: Everything is way too big. Like, “this isn’t 144 DPI” big. xdpyinfo | grep dots reports
resolution: 203x203 dots per inch
This is obviously wrong. But I set it to 144x144, didn’t I? Gtk’s auto detection is still broken, but now it’s just as wrong as the rest of my desktop.
So let’s try a different value.
xrandr --dpi 108x108
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto
I reboot, and am presented with the exact same result: xdpyinfo still reports 203x203 dots per inch, even though I set a clearly different value.
So I change the xrandr command again, to the following:
xrandr --dpi 100x100
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto
I am greeted by a tiny unreadable login screen. I look at the output of xdpyinfo, and discover that it now reports the following:
$ xdpyinfo | grep dots
  resolution:    102x102 dots per inch
That is not exactly what I set. It's close to 100x100, but a far cry from the 203x203 it reported when I set 108x108.
Could there be a reason behind this?
108 / 72 = 1.5 (-> rounds up to 2)
100 / 72 = 1.388… (-> rounds down to 1)
If I set the DPI before xrandr --auto, it appears to divide the requested DPI by 72, round to the nearest integer to get a scaling factor, and then derive the reported DPI from that. Setting it after xrandr --auto seems to do the trick (except for the sddm login screen and the Gtk auto detection), but only the nvidia xorg config value seems to actually work properly for everything. Gtk is the exception either way, and it may just be unfixably broken, because I know it uses a scaling factor (boo) which is an integer (booooo).
Conclusion: Please look into how Optimus interacts with DPI settings. It appears to cause some inconsistencies in different applications. Blender, for example, believes it is using 72 DPI when using the nvidia config way, even though it should be able to query the system’s correct DPI. Gtk appears to always just get it wrong, and the xrandr way seems to have rounding issues.