I’ve been reading this slide deck, from an NVIDIA developer, about X server scaling on the GPU for applications that look up resolutions via XRandR / EDID.
Nowadays the NVIDIA driver lets us set proper desktop scaling as described in the slides.
The problem comes with applications (mostly games) that check the XRandR (EDID-provided) resolutions to build their list of video modes.
As I understand the slides, NVIDIA’s suggestion for overcoming this was to be able to “give” xrandr a configuration file (or something similar) with the available scaled resolutions to present to the apps, and to provide a command for the X server to do the scaling itself when an app asks for one of those scaled resolutions, like this: xrandr --output DVI-I-1 --mode 1920x1200 --set Border 0,60 --scale 0.6666667x0.6666667
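To illustrate what that scale factor means (this is my own sketch, not from the slides): a scale of 0.6666667 on a 1920-wide mode corresponds to a 1280-wide logical desktop, since the factor is simply target width / native width. The output name `DVI-I-1` and the resolutions are just the ones from the example command:

```shell
# Sketch: derive the --scale factor for exposing a 1280x800 logical
# desktop on a 1920x1200 native mode (target / native).
scale=$(awk 'BEGIN { printf "%.7f", 1280 / 1920 }')
echo "$scale"   # 0.6666667

# With a running X server, this would ask the server to do the scaling
# (same flags as in the slides' example; see xrandr(1)):
#   xrandr --output DVI-I-1 --mode 1920x1200 --scale ${scale}x${scale}
```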
And I wonder if there’s any way to make this work now.
In short: NVIDIA-provided resolutions (scaled MetaModes) = resolutions exposed through xrandr = GPU scaling applied via xrandr options.
I’m probably not explaining myself perfectly, sorry, but I think the problem is real, and it seems a possible solution has existed since 2013. I guess most of the NVIDIA part is already done with the ViewPortIn / ViewPortOut options, and xrandr already supports this kind of scaling, so what can be done to achieve this?
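For reference, this is the kind of ViewPortIn / ViewPortOut setup I mean. The fragment below is hypothetical (output name, resolutions, and border offset are my own guesses matching the example command above), using the MetaModes syntax from the NVIDIA driver README:

```
# Hypothetical xorg.conf fragment: render a 1280x720 desktop and let the
# GPU scale it into a 1920x1080 region with 60px borders top and bottom
# on a 1920x1200 panel.
Section "Screen"
    Identifier "Screen0"
    Option "metamodes" "DVI-I-1: 1920x1200 { ViewPortIn=1280x720, ViewPortOut=1920x1080+0+60 }"
EndSection
```

The missing piece, as far as I can tell, is that a mode configured this way is not what games see when they query XRandR.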
Edit: this is a follow-up to another topic I opened some time ago.