Pixelated screen at 1920x1080, unable to add other resolutions

Dear developers, I have a GTX 1650 card installed in a Debian 10 machine with version 460.39 of the NVIDIA driver. It’s connected via an HDMI cable to a Samsung T27A550 TV, which has a 1920x1080 resolution. The card works fine at 1680x1050; however, at 1920x1080 the screen becomes pixelated.

Before I installed the NVIDIA drivers I was, if I’m not mistaken, using a resolution of 1856x1392. Since installing the drivers this resolution no longer appears as an option in the monitor control panel or nvidia-settings. I tried adding it with:

xrandr --newmode "1856x1392" 218.06 1856 1992 2192 2528 1392 1393 1396 1440 -HSync +VSync
and then:
xrandr --addmode HDMI-0 1856x1392

but I get the following error:

X Error of failed request: BadMatch (invalid parameter attributes)

I also tried setting ViewPortIn, ViewPortOut and Panning in the advanced display configuration of nvidia-settings. ViewPortIn and Panning get set, but ViewPortOut does not. When applied, it fixes the pixelation, but the screen is distorted.

I also tried various things with /etc/X11/xorg.conf, although I’m not even sure if the file is being used.
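One way I know of to confirm whether that file is actually being read (assuming X writes its log to the usual location; on some setups it ends up under ~/.local/share/xorg/ instead) is to search the Xorg log for the config-file line:

grep -i "config file" /var/log/Xorg.0.log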

Can anyone please offer advice on how I can achieve a resolution of 1920x1080 or 1856x1392?

Thanks

nvidia-bug-report.log.gz (1.0 MB)

I suppose you should first check the zoom/overscan options of the TV/monitor to get a just-scan picture. Otherwise, you could try playing with the ViewPortIn/ViewPortOut settings in nvidia-settings to find a MetaMode that works.
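For example, something along these lines, where the mode name and the ViewPortOut size/offset are just placeholder values to illustrate the syntax (an underscanned 1080p picture); the same MetaMode string can also go into xorg.conf:

nvidia-settings --assign CurrentMetaMode="HDMI-0: 1920x1080 { ViewPortIn=1920x1080, ViewPortOut=1824x1026+48+27 }"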

Thanks. Yes, I’ve tried all of the Samsung Menu -> Screen Adjustment -> Picture Size settings, e.g. 16:9, 4:3, fit screen, etc. (if that’s what you mean), but it doesn’t help the pixelation at 1920x1080. I’ve tried entering various resolutions from List of common resolutions - Wikipedia, but so far I’ve not found anything suitable where I’m able to set all of the values for ViewPortIn, ViewPortOut and Panning. 1680x1050 works (with no pixelation), but I’d like a higher resolution (somewhere between 1680x1050 and 1920x1080).

I also checked with a second TV with HDMI input and the problems are the same. Neither TV has any overscan settings, although the second one does have a “no scaling” setting. I would try older drivers, but I need the latest with CUDA 11 in order to use DaVinci Resolve.

I found an unexpected solution to the problem. It involved discovering a remedy for the “BadMatch” errors given by xrandr: overriding the driver’s EDID-based mode validation. See NVIDIA/Troubleshooting - ArchWiki
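The ArchWiki page lists a few ways to do the override; as a minimal sketch of one of them (the ModeValidation tokens are documented in the NVIDIA driver README, and alternatives such as the UseEDID option also exist, so adjust to your setup), the validation can be relaxed in the Device section like this:

Section "Device"
    Identifier "Device0"
    Driver     "nvidia"
    Option     "ModeValidation" "AllowNonEdidModes"
EndSection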

With the change made to /etc/X11/xorg.conf (which I had created afresh using nvidia-settings) and after a reboot, I tested xrandr with this:

xrandr --newmode "1920x1080_new" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
and then
xrandr --addmode HDMI-0 "1920x1080_new"
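For reference, the timing numbers in the --newmode line above are, as far as I can tell, what the cvt utility prints for a 60 Hz mode at that resolution, so modelines for other resolutions can be generated the same way:

cvt 1920 1080 60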

When I opened nvidia-settings again I found that various new resolutions were present. 1920x1080 still doesn’t work, but a new entry for 1920x1200 does. So, problem solved. Strangely, 1680x1050, which used to work, no longer does; however, a new 1680x900 entry works and, despite the change in aspect ratio, doesn’t appear to be distorted in any way.

Based on your description, it sounds like the TV is broken and doesn’t present correct information in its EDID. That situation is pretty rare these days, but setting the various EDID overrides to allow you to manually set the correct resolution does sound like the right workaround.
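If you want to double-check what the TV actually reports, you can dump and decode its EDID; on Debian the read-edid package provides a pair of tools for this (get-edid may need root, and xrandr --verbose will also print the raw EDID block):

sudo apt install read-edid
sudo get-edid | parse-edid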

You may want to contact Samsung to see if there’s a way to update the TV’s firmware in order to get it to present correct EDID information to the computer.
