How to get monitor detected as 1280x800?

I have a small 7" HDMI monitor that I use at times when lugging a desktop-size monitor around is not an option.
The native resolution of this monitor is 1280x800.
Unfortunately, the TX2 only detects this as 1280x720. This means that all text/desktop/images are slightly stretched (tall) and I get fewer lines of text than I want on the display.

Is the HDMI driver supposed to be able to detect 1280x800? How can I make it do this for this display?
Just forcing the display mode with xrandr doesn’t work, as I run into an error when trying to actually apply it to the output:

X Error of failed request:  BadMatch (invalid parameter attributes)
  Major opcode of failed request:  140 (RANDR)
  Minor opcode of failed request:  18 (RRAddOutputMode)
  Serial number of failed request:  20
  Current serial number in output stream:  21

You may need the HorizSync and VertRefresh ranges added to xorg.conf
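A minimal sketch of what that Monitor section addition might look like (the sync ranges below are placeholders, not this panel’s real limits; substitute the values from the monitor’s datasheet):

```
Section "Monitor"
    Identifier  "HDMI-0"
    # Placeholder ranges -- use the panel's actual limits
    HorizSync   30.0 - 80.0
    VertRefresh 50.0 - 75.0
EndSection
```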

I already used cvt and xrandr. It’s when doing that that I get the “BadMatch (invalid parameter attributes)” error.

Note that this monitor has run in 1280x800 on other computers before. It runs at 60 Hz (or 59.90, as it were.)

Monitor detection should work by reading the EDID from the monitor. I’m not familiar with L4T, but on Ubuntu derivatives there’s the read-edid package, which lets you read and parse the monitor’s EDID (sudo get-edid | parse-edid).

What kind of modeline have you tried? If you haven’t yet, try a modeline with reduced blanking:

~$ cvt 1280 800 60 -r
# 1280x800 59.91 Hz (CVT 1.02MA-R) hsync: 49.31 kHz; pclk: 71.00 MHz
Modeline "1280x800R"   71.00  1280 1328 1360 1440  800 803 809 823 +hsync -vsync
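As a sanity check, the frequencies cvt prints follow directly from the modeline numbers: the line rate is the pixel clock divided by the horizontal total, and the refresh rate is the pixel clock divided by (horizontal total × vertical total). A quick Python check using the values above:

```python
# Values copied from the CVT-R modeline above:
# Modeline "1280x800R"  71.00  1280 1328 1360 1440  800 803 809 823
pclk_hz = 71.00e6   # pixel clock in Hz
htotal = 1440       # total pixels per scanline, including blanking
vtotal = 823        # total lines per frame, including blanking

hsync_khz = pclk_hz / htotal / 1e3        # line rate in kHz
refresh_hz = pclk_hz / (htotal * vtotal)  # frame rate in Hz

print(round(hsync_khz, 2), round(refresh_hz, 2))  # 49.31 59.91, matching cvt
```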

Thinking about this some more, the way to get it into 1280x800 mode on the previous computer (a Raspberry Pi) was to sledgehammer the graphics chip into 1280x800 mode by telling it “no, really, the EDID of this display is …”
As in, I think the default EDID output by the display does not include the native mode (!) and thus requires an override.
And, because xrandr asks the graphics driver, and the graphics driver consults the EDID, it will say “no such mode is available.”

So, what I need is a way to convince the NVIDIA graphics driver on the TX2 that this display really does support 1280x800x60 as a display mode.

If you look at the data from this command, you can paste it into http://www.edidreader.com and it will tell you what modes are currently selectable (as in, the modes the monitor advertises…it doesn’t mean the graphics driver will agree, but it should if there isn’t anything unusual):

sudo cat `find /sys -name 'edid'`

Yes, that data matches what xrandr reports as “possible.”
Unfortunately, 1280x800 isn’t exposed as a valid display mode by the EDID block.
HOWEVER, I know that, if I can force the graphics card to actually select/output the 1280x800 display format, it will work, and will be native on the panel.
(Blame cheap Chinese engineering for this state of the world.)

So, how do I force the display driver into this mode, which is not listed in the EDID information?

See “man xrandr”. Basically, if the mode is listed, use the “--mode” option. To add a mode that doesn’t exist, use “--newmode” and “--addmode”. The example in “man xrandr” shows this specific case.

I don’t know how many more times I can say this: I ALREADY TRIED THAT!!!
Quote my very first post in this thread:

The problem seems to be that the driver refuses to accept an xrandr modeline that doesn’t match the exposed EDID modes.

I need help from some NVIDIA graphics driver person who knows how to make the driver accept a mode “on blind faith” even though the EDID information seems to indicate that it’s “not supported.”

To make it 100% clear what happens when using xrandr, here is a shell transcript:

nvidia@tegra-ubuntu:~$ cvt 1280 800 60
# 1280x800 59.81 Hz (CVT 1.02MA) hsync: 49.70 kHz; pclk: 83.50 MHz
Modeline "1280x800_60.00"   83.50  1280 1352 1480 1680  800 803 809 831 -hsync +vsync
nvidia@tegra-ubuntu:~$ xrandr --newmode "1280x800_60.00"   83.50  1280 1352 1480 1680  800 803 809 831 -hsync +vsync
nvidia@tegra-ubuntu:~$ xrandr --addmode HDMI-0 1280x800_60.00
X Error of failed request:  BadMatch (invalid parameter attributes)
  Major opcode of failed request:  140 (RANDR)
  Minor opcode of failed request:  18 (RRAddOutputMode)
  Serial number of failed request:  20
  Current serial number in output stream:  21
nvidia@tegra-ubuntu:~$ cvt -r 1280 800 60
# 1280x800 59.91 Hz (CVT 1.02MA-R) hsync: 49.31 kHz; pclk: 71.00 MHz
Modeline "1280x800R"   71.00  1280 1328 1360 1440  800 803 809 823 +hsync -vsync
nvidia@tegra-ubuntu:~$ xrandr --newmode "1280x800-R"   71.00  1280 1328 1360 1440  800 803 809 823 +hsync -vsync
nvidia@tegra-ubuntu:~$ xrandr --addmode HDMI-0 1280x800-R
X Error of failed request:  BadMatch (invalid parameter attributes)
  Major opcode of failed request:  140 (RANDR)
  Minor opcode of failed request:  18 (RRAddOutputMode)
  Serial number of failed request:  20
  Current serial number in output stream:  21
nvidia@tegra-ubuntu:~$

This is a list of the currently defined modes:

nvidia@tegra-ubuntu:~$ xrandr
Screen 0: minimum 8 x 8, current 1280 x 720, maximum 32767 x 32767
HDMI-0 connected primary 1280x720+0+0 (normal left inverted right x axis y axis) 700mm x 390mm
   1280x720      60.00*+  59.94    50.00
   1920x1080     60.00    59.95    50.00    61.25    60.05    51.04    50.04
   1440x900      84.85    74.99    59.89
   1440x576      52.08
   1440x480      62.69
   1280x1024     75.03
   1024x768      75.03    70.07    60.01
   800x600       75.00    72.19    60.32    56.25
   720x576       50.00
   720x480       59.94
   720x400       70.04
   640x480       75.00    72.81    59.94
  1280x800R (0x1bd) 71.000MHz +HSync -VSync
        h: width  1280 start 1328 end 1360 total 1440 skew    0 clock  49.31KHz
        v: height  800 start  803 end  809 total  823           clock  59.91Hz
  1280x800_60.00 (0x1c0) 83.500MHz -HSync +VSync
        h: width  1280 start 1352 end 1480 total 1680 skew    0 clock  49.70KHz
        v: height  800 start  803 end  809 total  831           clock  59.81Hz
  1280x800-R (0x1c1) 71.000MHz +HSync -VSync
        h: width  1280 start 1328 end 1360 total 1440 skew    0 clock  49.31KHz
        v: height  800 start  803 end  809 total  823           clock  59.91Hz

(Note I tried the “cvt -r” version twice in this session.)

The monitor is this one.

LOL

Your error looks the same as this one (same major/minor opcodes). He fixed it with xorg.conf:
https://ubuntuforums.org/showthread.php?t=2324211

The NVIDIA driver is quite strict about mode validation, so you probably have to relax that in your xorg.conf. Check the X config options in the NVIDIA README.

If you want to use non-EDID modes, you need at least to add the line Option “ModeValidation” “AllowNonEdidModes” (or “HDMI-0: AllowNonEdidModes” if you only want to allow them for the HDMI output).

Of course, there’s also that old option that was used ages ago, Option “UseEDID” “FALSE”; then the driver just tries every VESA mode during the mode-validation process. But if it does not find a working mode, the screen will be blank. So give the driver the HorizSync and VertRefresh ranges if you know them, or add your own modelines to xorg.conf for the driver to use.
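In xorg.conf terms, the per-output variant suggested above might look like this (a sketch based on the ModeValidation syntax in the NVIDIA README; the option goes in the Device section):

```
Section "Device"
    Identifier "Tegra0"
    Driver     "nvidia"
    # Allow modes not present in the EDID, but only for the HDMI output
    Option     "ModeValidation" "HDMI-0: AllowNonEdidModes"
EndSection
```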

That looks promising! Thanks for the pointer. I will try this and report back.

Nope, still doesn’t want to use a custom mode for this display.

Even though I have turned on AllowNonEdidModes, and a few other flags, I still get this:

Mode is rejected: Only modes from the NVIDIA X driver's
    predefined list and modes from the EDID are allowed
    Mode "1280x800_60.00" is invalid.

Note that I can tell that options are parsed, because I can turn on the extended mode validation debug info.

Error trying to apply the mode (still the same):

nvidia@tegra-ubuntu:~$ xrandr --addmode HDMI-0 "1280x800_60.00"
X Error of failed request:  BadMatch (invalid parameter attributes)
  Major opcode of failed request:  140 (RANDR)
  Minor opcode of failed request:  18 (RRAddOutputMode)
  Serial number of failed request:  20
  Current serial number in output stream:  21

xorg.conf:

Section "Module"
    Disable     "dri"
    SubSection  "extmod"
        Option      "omit xfree86-dga"
    EndSubSection
EndSection

Section "Device"
    Identifier  "Tegra0"
    Driver      "nvidia"
    Option      "AllowEmptyInitialConfiguration" "TRUE"
    Option      "TripleBuffer" "FALSE"
    Option      "ModeDebug" "TRUE"
    Option      "UseEdidFreqs" "FALSE"
    Option      "ModeValidation" "AllowNonEdidModes,NoTotalSizeCheck,NoVertRefreshCheck,NoHorizSyncCheck,NoMaxSizeCheck,NoEdidMaxPClkCheck"
EndSection

Section "Monitor"
    Identifier  "HDMI-0"
    Option      "DPI" "140 x 140"
    Option      "UseEDIDDpi" "FALSE"
    Modeline    "1280x800_60.00"   83.50  1280 1352 1480 1680  800 803 809 831 -hsync +vsync
EndSection

Section "Monitor"
    Identifier   "DSI-0"
    Option       "Ignore"
EndSection

Xorg.0.log:

[   295.667] (--) NVIDIA(GPU-0): RTK 32V3H-H6A (DFP-0): connected
[   295.667] (--) NVIDIA(GPU-0): RTK 32V3H-H6A (DFP-0): External TMDS
[   295.667] (--) NVIDIA(GPU-0): RTK 32V3H-H6A (DFP-0) Name Aliases:
[   295.667] (--) NVIDIA(GPU-0):   DFP
[   295.667] (--) NVIDIA(GPU-0):   DFP-0
[   295.667] (--) NVIDIA(GPU-0):   DPY-0
[   295.667] (--) NVIDIA(GPU-0):   HDMI-0
[   295.667] (--) NVIDIA(GPU-0):   DPY-EDID-7126c6dd-de8a-f8da-d931-2fc5e1ea3d49
[   295.667] (--) NVIDIA(GPU-0):   HDMI-0
[   295.832] (II) NVIDIA(0): Setting mode "HDMI-0: nvidia-auto-select @1280x720 +0+0 {ViewPortIn=1280x720, ViewPortOut=1280x720+0+0}"
[   296.359] (--) NVIDIA(GPU-0): RTK 32V3H-H6A (DFP-0): connected
[   296.359] (--) NVIDIA(GPU-0): RTK 32V3H-H6A (DFP-0): External TMDS
[   296.359] (--) NVIDIA(GPU-0): RTK 32V3H-H6A (DFP-0) Name Aliases:
[   296.359] (--) NVIDIA(GPU-0):   DFP
[   296.359] (--) NVIDIA(GPU-0):   DFP-0
[   296.359] (--) NVIDIA(GPU-0):   DPY-0
[   296.359] (--) NVIDIA(GPU-0):   HDMI-0
[   296.359] (--) NVIDIA(GPU-0):   DPY-EDID-7126c6dd-de8a-f8da-d931-2fc5e1ea3d49
[   296.359] (--) NVIDIA(GPU-0):   HDMI-0
[   296.441] (--) NVIDIA(GPU-0): RTK 32V3H-H6A (DFP-0): connected
[   296.441] (--) NVIDIA(GPU-0): RTK 32V3H-H6A (DFP-0): External TMDS
[   296.441] (--) NVIDIA(GPU-0): RTK 32V3H-H6A (DFP-0) Name Aliases:
[   296.441] (--) NVIDIA(GPU-0):   DFP
[   296.441] (--) NVIDIA(GPU-0):   DFP-0
[   296.441] (--) NVIDIA(GPU-0):   DPY-0
[   296.441] (--) NVIDIA(GPU-0):   HDMI-0
[   296.441] (--) NVIDIA(GPU-0):   DPY-EDID-7126c6dd-de8a-f8da-d931-2fc5e1ea3d49
[   296.441] (--) NVIDIA(GPU-0):   HDMI-0
[   350.188] (WW) NVIDIA(GPU-0):   Validating Mode "1280x800_60.00":
[   350.188] (WW) NVIDIA(GPU-0):     Mode Sources: User Specified, RandR Specified
[   350.188] (WW) NVIDIA(GPU-0):     1280 x 800 @ 60 Hz
[   350.188] (WW) NVIDIA(GPU-0):       Pixel Clock      : 83.50 MHz
[   350.188] (WW) NVIDIA(GPU-0):       HRes, HSyncStart : 1280, 1352
[   350.188] (WW) NVIDIA(GPU-0):       HSyncEnd, HTotal : 1480, 1680
[   350.188] (WW) NVIDIA(GPU-0):       VRes, VSyncStart :  800,  803
[   350.188] (WW) NVIDIA(GPU-0):       VSyncEnd, VTotal :  809,  831
[   350.188] (WW) NVIDIA(GPU-0):       H/V Polarity     : -/+
[   350.188] (WW) NVIDIA(GPU-0):     Mode is rejected: Only modes from the NVIDIA X driver's
[   350.188] (WW) NVIDIA(GPU-0):     predefined list and modes from the EDID are allowed
[   350.188] (WW) NVIDIA(GPU-0):     Mode "1280x800_60.00" is invalid.
[   350.188] (WW) NVIDIA(GPU-0):

Can you get the Xorg log with verbosity set to --logverbose 6? I think with that option, it actually displays the configuration options it parses. It might not find the problem, but it would at least say whether it is accepting the EDID options. As a side note, get-edid doesn’t return anything on my TX2.

I didn’t get more information about the mode I’m trying to define, BUT:

startx -- -verbose 6 -logverbose 6 showed me something else that seemed important:

[  9637.544] (WW) NVIDIA(GPU-0): Unrecognized ModeValidation token "AllowNonEdidModes";
[  9637.544] (WW) NVIDIA(GPU-0):     ignoring.
[  9637.544] (WW) NVIDIA(GPU-0): Unrecognized ModeValidation token "NoTotalSizeCheck";
[  9637.544] (WW) NVIDIA(GPU-0):     ignoring.
[  9637.544] (WW) NVIDIA(GPU-0): Unrecognized ModeValidation token "NoVertRefreshCheck";
[  9637.544] (WW) NVIDIA(GPU-0):     ignoring.
[  9637.544] (WW) NVIDIA(GPU-0): Unrecognized ModeValidation token "NoHorizSyncCheck";
[  9637.544] (WW) NVIDIA(GPU-0):     ignoring.
[  9637.544] (WW) NVIDIA(GPU-0): Unrecognized ModeValidation token "NoEdidMaxPClkCheck";
[  9637.545] (WW) NVIDIA(GPU-0):     ignoring.
[  9637.545] (**) NVIDIA(GPU-0): Mode Validation Overrides for RTK 32V3H-H6A (DFP-0):
[  9637.545] (**) NVIDIA(GPU-0):     NoMaxSizeCheck

So, is this driver built without the ability to override the EDID?
Or did they rename these flags compared to what the documentation says?

Bump – it seems that the NVIDIA driver for Tegra is ignoring the AllowNonEdidModes ModeValidation option?
How can I force the driver into the mode I know the display will work with even though the display EDID is inaccurate?

It looks like those option names (AllowNonEdidModes, etc.) are still current, as they are present in the string table of the nvidia driver itself, but for some reason they are being ignored. Given that the driver is ignoring them, two workarounds might be:

  1. Is the “UseEDID” “FALSE” option unrecognized as well? If not, that could at least stop it from trying to use the EDID and maybe use the custom mode. Does that option work, or does it give the same unrecognized error?

  2. create your own modified edid and tell it to use that. Something like this:
    http://kodi.wiki/view/Creating_and_using_edid.bin_via_xorg.conf

Your xorg conf looks like it should work and those options should be recognized.
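If the driver will accept it, workaround 2 usually goes through the NVIDIA X driver’s CustomEDID option, which points the driver at an EDID file on disk instead of what the monitor reports. A sketch (the /etc/X11/edid.bin path is just an example; whether the Tegra build of the driver honors this option is untested here):

```
Section "Device"
    Identifier "Tegra0"
    Driver     "nvidia"
    # Use the EDID from this file for HDMI-0 instead of the panel's own
    Option     "CustomEDID" "HDMI-0:/etc/X11/edid.bin"
EndSection
```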

I tried creating a new edid for 1280x800, but the file generated seems invalid.
As in, edid-decode complains about it.
Looking at the script, it’s a total shell-script hack that tries to force the assembler and objcopy to do the right thing, and my guess is it doesn’t on modern Ubuntu.

Try putting the EDID data into http://www.edidreader.com and see if the checksum is valid. If it is, then it is other parts of the data that are incorrect. If not, it may just mean you need to update the checksum.
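For reference, the checksum rule for a base EDID block is simple: all 128 bytes must sum to 0 modulo 256, with byte 127 chosen to make that true. A small Python sketch (a hypothetical helper, not part of any tool mentioned in this thread) that validates and repairs the checksum:

```python
def edid_checksum_ok(block: bytes) -> bool:
    """True if this 128-byte EDID block has a valid checksum."""
    return len(block) == 128 and sum(block) % 256 == 0

def fix_edid_checksum(block: bytes) -> bytes:
    """Return a copy of a 128-byte EDID block with byte 127 recomputed.

    Each 128-byte EDID block must sum to 0 modulo 256; byte 127 is the
    checksum chosen to make that so.
    """
    assert len(block) == 128
    body = block[:127]
    checksum = (-sum(body)) % 256
    return body + bytes([checksum])
```

If edid-decode only complains about the checksum, rewriting byte 127 this way may be all the generated file needs.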

Can you post your hex EDID? (output from reply #6 in this thread)