I searched the Internet for how to enable GPU scaling in the NVIDIA Linux drivers. It seems the nvidia-settings program once had an option for this (the current Windows counterpart still does), but it was removed some years ago. So my question is: how do I enable GPU scaling on Linux?
How can I get GPU scaling to work? - NVIDIA Developer Forums
Hey JGB123321, the link you posted contains more questions than answers, and the people there had strange problems, some of which weren’t solved. Could you or anyone else please explain what exactly to do?
With sincere respect, P.K., I neither know nor care what GPU scaling is. I posted that link to aid you in furthering your own search for answers.
On a different yet more fundamental note, too many members of this forum choose not to include complete and accurate system specifications (model numbers are good) in their forum signatures (one may have to log out and back in to edit said signature), relying instead solely upon nVidia bug reports or posting no system specs at all.
The downside of this approach is that it narrows the scope for the process-of-elimination search which forms the backbone of troubleshooting many a PC problem.
Take a look at my forum signature. Every pre-emptively researched, constituent component of my PC rig is listed by model number–and that entire ensemble works correctly in all regards with none of the issues I see in abundance on this forum and on its Windows-centric flip-side, forums.geforce.com.
There are two primary reasons for this:
I exhaustively researched each component (by searching: make & model #, problems) before buying anything. That is why I chose an ECC-RAM-supporting member of the ‘obsolete’ AM3+ platform: it is the last and most powerful commercially available PC platform that pre-dates the era of permanently baked-in, proprietary, out-of-band ‘remote management’ and DRM-enforcement technologies such as Trusted Computing, Management Engine, Active Management Technology, vPro, Anti-Theft Technology, Platform Security Processor and ARM TrustZone.
In planning my PC I focussed upon reliability through simplicity (no over-clocking, liquid cooling, multi-monitor setups, vanity windows, disco lights or superfluous complexities of any kind), thus ensuring that my rig would function effectively as an autodidact’s research tool, not as a mere toy…
“While video games have generally been associated with children, the average gamer is actually 35 years old. Male gamers comprise 56% of the total gaming population while female gamers make up 44%. Out of the total U.S. population, roughly 59% of Americans play video games, with 51% of households owning a dedicated game console…”
Updated October 21, 2015
How The Video Game Industry Works (AMD, AMZN) | Investopedia
With sincere respect, JGB123321, if you neither know nor care what GPU scaling is, why are you even in this thread?
If someone asks for help on the official forum for that hardware, don’t turn up in threads you have no interest in helping with, and then complain when people post back in confusion at your non-answer. Your further tirade also has nothing to do with the original question and doesn’t help in the slightest: this is a question about a now-missing driver option, not about someone’s choice of hardware. It seems you posted all that simply to toot your own horn.
Anyway, regarding the actual thread topic: @painkiller17 I believe we’re now supposed to use the Xorg options “IncludeImplicitMetaModes” and “MetaModes” to handle this instead, but I’ve yet to find a configuration that works (which is why I was searching and found this thread in the first place). The closest line I’ve found so far that Xorg and the driver don’t complain about is…
Option "IncludeImplicitMetaModes" "DisplayDevice=DFP-3, Scaling=Aspect-Scaled, UseModePool=false"
… but unfortunately the monitor is still switching, so clearly it’s not being handled internally.
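For anyone else landing here from a search, this is roughly where that option sits in xorg.conf. This is a sketch only: the section identifiers are placeholders, and DFP-3 is just the display name from the line above, so substitute your own.

```
# Sketch of an xorg.conf Screen section using the option quoted above.
# "Screen0"/"Device0" are placeholder identifiers; DFP-3 is the display
# name from this thread -- check yours in the Xorg log or nvidia-settings.
Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
    Option     "IncludeImplicitMetaModes" "DisplayDevice=DFP-3, Scaling=Aspect-Scaled, UseModePool=false"
EndSection
```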
The options you’re looking for are ViewPortIn and ViewPortOut, which are part of the MetaMode attribute string.
Scaling is enabled if the sizes of ViewPortIn and ViewPortOut don’t match. nvidia-settings has options to specify the scaling explicitly in the advanced display config tab.
In basic mode, there are also entries marked as “(scaled)” in the resolution drop-down. Those automatically configure ViewPortIn scaling for common screen sizes that aren’t natively supported by your monitor.
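To make that concrete, here is the shape of a MetaMode using those attributes. The display name and resolutions below are made-up examples, not values from this thread:

```
# Runtime assignment via nvidia-settings: render a 1280x720 desktop
# (ViewPortIn) and let the GPU scale it onto the panel's native
# 1920x1080 mode (ViewPortOut), so the monitor itself never changes modes.
# "DPY-2" is a placeholder; list your displays with `nvidia-settings -q dpys`.
nvidia-settings --assign CurrentMetaMode="DPY-2: 1920x1080 { ViewPortIn=1280x720, ViewPortOut=1920x1080+0+0 }"
```

The equivalent persistent form would be an xorg.conf line such as `Option "MetaModes" "1920x1080 { ViewPortIn=1280x720, ViewPortOut=1920x1080+0+0 }"`.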
Is it possible to set “ViewPortIn” to “anything”, as in “*”? Basically I want the GPU to only ever output one specific resolution; whatever source resolution it tries to use should be scaled, and it should never try to switch the mode on the display itself.
Unfortunately I can’t switch to advanced mode as the dialogue is too big…
It doesn’t even work at 720p, it’s still too big. I had to add a fake frame-buffer of 1080p to be able to see all of the window. That seems like a bit of an oversight.