Reference to Xorg.conf and Resolution Settings

Using Option "moreVerboseModeValidation" "true", I see this:

[    10.853] (WW) NVIDIA(0): Option "moreVerboseModeValidation" is not used

No logging seems to occur from this…perhaps it doesn’t apply to iGPUs.

I am also facing problems since I use a cheap HDMI monitor, so I am looking into this topic as well and I'll add my two cents.

Some unusual things about my case:

  • I'm using a TX2 as received, flashed with R28.1.
  • I'm running an R28.2-DP2 (pre-release) on a SATA SSD:
head -n 1 /etc/nv_tegra_release 
# R28 (release), REVISION: 2.0, GCID: 10136452, BOARD: t186ref, EABI: aarch64, DATE: Fri Dec  1 14:20:33 UTC 2017

This is done through extlinux.conf, with the R28.2 Linux kernel image placed in the R28.1 eMMC /boot directory (a sketch of the entry is shown after this list).

  • I've been fine with this configuration so far, it is up to date via apt, and for various reasons I'd like to keep using it.
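
For reference, the kind of extlinux.conf entry I mean looks roughly like this (a sketch only; the label, image name and root device here are illustrative, not copied from my actual file):

TIMEOUT 30
DEFAULT r28.2-ssd

MENU TITLE p2771-0000 eMMC boot options

LABEL r28.2-ssd
      MENU LABEL R28.2-DP2 kernel, rootfs on SATA SSD
      LINUX /boot/Image.r28.2
      APPEND ${cbootargs} root=/dev/sda1 rw rootwait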

My monitor EDID:

sudo cat /sys/kernel/debug/tegradc.0/edid
 00 ff ff ff ff ff ff 00 4c 2d 7a 06 00 00 00 00
 32 13 01 03 80 10 09 78 0a ee 91 a3 54 4c 99 26
 0f 50 54 bd ee 00 01 01 01 01 01 01 01 01 01 01
 01 01 01 01 01 01 66 21 50 b0 51 00 1b 30 40 70
 36 00 a0 5a 00 00 00 1e 01 1d 00 72 51 d0 1e 20
 6e 28 55 00 a0 5a 00 00 00 1e 00 00 00 fd 00 18
 4b 1a 44 17 00 0a 20 20 20 20 20 20 00 00 00 fc
 00 53 41 4d 53 55 4e 47 0a 20 20 20 20 20 01 43
 02 03 23 f1 4b 84 13 05 14 03 12 10 1f 20 21 22
 23 09 07 07 83 01 00 00 e2 00 0f 67 03 0c 00 10
 00 b8 2d 01 1d 00 bc 52 d0 1e 20 b8 28 55 40 a0
 5a 00 00 00 1e 01 1d 80 18 71 1c 16 20 58 2c 25
 00 a0 5a 00 00 00 9e 01 1d 80 d0 72 1c 16 20 10
 2c 25 80 a0 5a 00 00 00 9e 8c 0a d0 8a 20 e0 2d
 10 10 3e 96 00 a0 5a 00 00 00 18 00 00 00 00 00
 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 dc

Using edidreader.com, I see serial number 0, so I am unsure whether it is a genuine Samsung or a counterfeit.
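
As an alternative to the website, the EDID can also be decoded locally. Something along these lines should work, assuming the edid-decode package is installed (I have not verified this exact pipeline on the Jetson):

sudo apt-get install edid-decode
sudo cat /sys/kernel/debug/tegradc.0/edid | xxd -r -p | edid-decode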

The Problem:
While things worked fine with my previous monitor and with an HDMI TV, with this monitor I am seeing some problems.
With R28.1, it runs fine at 1360x768.
But with R28.2-DP2 (using same default t186ref xorg.conf), here is what happens:
0- Virtual console starts with 768p.
When Ubuntu starts it gets:
1- 1080p with correct colors (a bit dark, though) for about 1 second.
2- Then it switches to another 1080p mode with flashy (pink) colors where I can log in, but the left, top, right and bottom margins are wrong and I can't see the whole screen.
3 - When Ubuntu starts my session at 1080p, I still get the flashy colors, I can't see the menu bar, and I can't see the left half of the icons in the left bar.
3 bis - If I select 1360x768 in the Ubuntu display settings, then when Ubuntu starts my user session the 1360x768 mode works fine with correct colors. However, for some applications such as Firefox browsing the Jetson forum, I have to zoom to 80 or 90% in order to see everything, so I suppose something is confused somewhere.

So it is not very harmful, since I'm able to use it, but because R28.2 showed my monitor can do 1080p, I have tried to see whether I can adjust the margins with modelines. I have found that the NVIDIA X server rejects any modeline supplied by the user through xorg.conf, as well as the built-in X server modes. To save a lot of useless lines in Xorg.0.log, I've tried setting Option "ModeValidation" to "NoXServerModes"; it is accepted but the behavior seems unchanged…modes are probed but not validated.
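
For reference, this is roughly how the option is set in the Device section of xorg.conf (a sketch; the "Tegra0" identifier is taken from the default t186ref config, adjust if yours differs):

Section "Device"
    Identifier  "Tegra0"
    Driver      "nvidia"
    Option      "ModeDebug"      "true"
    Option      "ModeValidation" "NoXServerModes"
EndSection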

The only way I've found to have any influence is using metamodes, but only metamodes that the NVIDIA X server has already logged. Anything else is silently ignored and it falls back to nvidia-auto-select.

If I specify a 1360x768 metamode, it seems to work, but as soon as the Ubuntu login screen appears there is some kind of reset or mode switch and it goes back to the nvidia-auto-select 1080p flashy mode. Once I'm logged in, it sets the mode according to my Ubuntu settings.
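
The metamode attempt itself looks roughly like this (a sketch of what the attached xorg.conf requests, not a verbatim copy; identifiers are illustrative):

Section "Screen"
    Identifier  "Default Screen"
    Device      "Tegra0"
    Option      "MetaModes" "HDMI-0: 1360x768 +0+0"
EndSection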

So here are some questions:

  1. Is there a document that describes this X server/Ubuntu startup process? Or can you give some explanation?

  2. Is there a document that describes the available options in the X config?

  3. Is there a chance I can use 1080p with this monitor? Is there a way to adjust margins? Is there a way to adjust colors? I have to mention that I see varying high DPI values (215, 216 or 302, 302), but disabling DPI from EDID and falling back to 75 dpi didn't change anything (see the sketch after these questions).

  4. Is a custom EDID file option supported?
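
Regarding the DPI part of question 3, the override I mean is along these lines, added to the Device section (UseEdidDpi and DPI are standard NVIDIA X driver options; the value shown is just an example):

    Option      "UseEdidDpi"  "false"
    Option      "DPI"         "96 x 96"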

Attached these logs:
Xorg.R28.1.log is the log with R28.1, and standard xorg.conf, just added Option ModeDebug.
Xorg.R28.2.log is the log with R28.2, and standard xorg.conf, just added Option ModeDebug.
Xorg.R28.2.Custom.log is the log with R28.2, and with the attached xorg.conf requesting 1360x768 metamode.

Thanks for any additional info.
Xorg.R28.1.log (160 KB)
Xorg.R28.2.log (159 KB)
xorg.conf.custom.txt (1.14 KB)
Xorg.R28.2.Custom.log (158 KB)

Additional info: It seems the reported screen dimensions are wrong (160mm x 90mm, but the panel is actually about twice that) as seen from xrandr:

xrandr --props
Screen 0: minimum 8 x 8, current 1360 x 768, maximum 32767 x 32767
HDMI-0 connected primary 1360x768+0+0 (normal left inverted right x axis y axis) 160mm x 90mm
	TegraOverlayBlendmode: Opaque 
		supported: Opaque, SourceAlphaBlend, PremultSourceAlphaBlend
	TegraOverlayPriority: 255 
		range: (0, 255)
	EDID: 
		00ffffffffffff004c2d7a0600000000
		32130103801009780aee91a3544c9926
		0f5054bdee0001010101010101010101
		010101010101662150b051001b304070
		3600a05a0000001e011d007251d01e20
		6e285500a05a0000001e000000fd0018
		4b1a4417000a202020202020000000fc
		0053414d53554e470a20202020200143
		020323f14b841305140312101f202122
		2309070783010000e2000f67030c0010
		00b82d011d00bc52d01e20b8285540a0
		5a0000001e011d8018711c1620582c25
		00a05a0000009e011d80d0721c162010
		2c2580a05a0000009e8c0ad08a20e02d
		10103e9600a05a000000180000000000
		000000000000000000000000000000dc
	BorderDimensions: 4 
		supported: 4
	Border: 0 0 0 0 
		range: (0, 65535)
	SignalFormat: TMDS 
		supported: TMDS
	ConnectorType: HDMI 
   1920x1080     60.00 +  59.95    50.00    30.00    29.97    25.00    24.00    23.98  
   1360x768      60.02* 
   1280x720      60.00    59.94    50.00  
   1024x768      75.03    70.07    60.01  
   832x624       75.05  
   800x600       75.00    72.19    60.32  
   720x576       50.00  
   720x480       59.94  
   720x400       70.04  
   640x480       75.00    72.81    67.06    59.94

This might be related to the high DPI values mentioned above.
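
As a rough cross-check (my own arithmetic, not taken from any log), the reported physical size does line up with those DPI values:

1360 px / (160 mm / 25.4 mm per inch) ≈ 216 dpi   (matches the 215/216 seen at 1360x768)
1920 px / (160 mm / 25.4 mm per inch) ≈ 305 dpi   (in the ballpark of the ~302 seen at 1080p)

With the real panel being about twice that size, the DPI should be roughly half.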

Hello everyone,

We want to set a 720x576 resolution for HDMI out instead of querying it from the monitor. Can we do that by editing xorg.conf?

Do we have to load EDID data as well to set this resolution?

Thanks and regards,

I can’t give you an answer. What I can tell you is that if the mode is not available via EDID, then you can’t use that mode. If the mode is not within the predefined mode list, then the mode also cannot be used. If your monitor does not have EDID, then there may be ways of programmatically doing this through a modified kernel, but someone else will need to give those details. If you do have EDID, then hopefully this information will be clarified.

Sorry for the late reply. I was out of the office for weeks. I am still checking whether we can provide a list of Xorg options on the forum.

As for custom modes and custom EDID, the answer is no. They definitely do not work through the L4T xorg.conf.

To use a custom EDID, you need to add it in the device tree (dts) or in the kernel driver.

I am very curious about the relationship between EDID and the device tree (I had never considered the device tree as a method of picking a mode). I personally consider "custom" to be a mode not available as a predefined mode; picking a mode within the allowed, predefined modes is simply a manual pick of a predefined mode. In other words, I have been considering modifying the mode pool as custom, and picking an entry within the mode pool as standard and non-custom.

Is there a method via the device tree to influence what nvidia-auto-select will pick from a mode pool when there are a large number of possible predefined modes? I'm still struggling to pick modes which I know the driver allows.

I am guessing 720x576 is not in the standard list of modes and is considered “custom” because it would not normally be part of a mode pool.

I think the reason the "CustomEDID" option does not work is that it requires X to pass a user-defined EDID from userspace to the kernel, and our driver does not implement that.
This avoids some potential problems for display initialization. As you know, the boot-up procedure of tegradc is complicated; we would need to consider every use case if we wanted to add this to the X driver.

That also explains why an EDID in the device tree works well: it does not need to be copied from userspace, and tegradc can simply use it at display init.

Is there any possibility the “edid” file in “/sys” could be made writable? If this file were writable, perhaps we could create a custom EDID with just the mode we want. This would in no way cause the driver to allow modes not already in the mode pool, but we really need a way to configure the resolution when the resolution is valid.

In the case of device tree, what is the entry, and what goes there? Is it just the modeline in the same format as what xorg.conf would use?

For the device tree solution, it seems there is a script in R28.2, located in the kernel directory and named nv-enable-hard-coded-kernel-boot-display-mode.sh, with a workaround for an fbconsole pixel clock calculation issue. This is described in the R28.2 documentation under kernelCustomization/DisplayConfigurationAndBringup/Hard-codingKernelDisplayBootModeForHDMI.

Hi WayneWWW/Honey_Patouceul,

Can you please tell me how to set a 720x576 resolution for HDMI out? We don't have HDMI out exposed on the TX2 interface board; we want to feed the HDMI output into an HDMI receiver (ADV7611) at this resolution.

What exactly do I need to change in the device tree, and where?

Thanks and Regards,
Shivlal

I haven't tried it yet, so be aware this is pure speculation, but from what I've seen you would just edit the script I mentioned in /kernel, adapt it to your settings, and running it would patch the dtb (read the doc for details). Then flash the patched device tree into the Jetson and try.
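
Still speculative, but after the script has patched the dtb I would expect the flashing step to be something like this from the host's L4T directory (check the R28.2 flashing docs; "-r" reuses the existing system image and "-k kernel-dtb" should flash only the device tree partition on a TX2):

sudo ./flash.sh -r -k kernel-dtb jetson-tx2 mmcblk0p1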

shivlal12345,

Is your receiver providing an EDID to the Tegra? When you connect the cable to the Tegra, does it produce new kernel log messages?

Greetings,

This is to flag up that I’m having similar problems. The Nvidia driver rejects modes provided by the monitor’s EDID. What happens is that the requested pixel clock is up above 200MHz, so the driver thinks it’s invalid. I’m guessing this results from the fact that we’re using a DisplayPort to DVI-D converter (old monitor). If I force the driver to ignore pixel clock calculations (ModeValidation NoMaxPClkCheck) then Nvidia X server settings will let me choose the correct resolution … but the monitor just flickers uselessly.

Specs:
Ubuntu 18.04
Nvidia 390.XX driver, supported by Canonical
Quadro P4000
Dual monitor setup with separate X screens
Monitor 2 is Samsung SyncMaster XL30 connected via DP to DVI-D converter

If anyone can suggest the correct solution (if any) then please let me know.

Cheers,
JS

UPDATE - I upgraded to Nvidia driver 418 using the package from https://launchpad.net/~graphics-drivers/+archive/ubuntu/ppa but this did not resolve the issue. The Quadro P4000 is supposed to support 4 connected screens with 4096x2160 @ 120Hz each, so somehow I don’t believe that there’s actually a pixel clock issue. We should be able to run a monitor at 2560x1600 @ 60Hz, but the driver rejects this.
Xorg.1.log (287 KB)
xorg.conf.zip (1.8 KB)
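
For reference, the rough numbers behind my earlier guess about the converter (my own arithmetic, not from the driver logs):

2560x1600 @ 60 Hz with CVT reduced blanking needs a pixel clock of roughly 268 MHz.
Single-link DVI tops out at 165 MHz; only dual-link DVI can carry that rate.

So the GPU itself should be fine, but if the DP to DVI-D adapter is single-link only, both the rejection and the flicker when the check is bypassed would make sense.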

Is it possible to use a DVI monitor with an HDMI-DVI converter on a Jetson Nano and have the screen display automatically, without manually modifying the xorg.conf file? If so, what kind of DVI? The DVI port on my monitor has 28 pins.

Some DVI monitors have EDID. Those should work with an HDMI-DVI converter, but beware that not all DVI monitors (analog) provide EDID…those will fail. If the monitor and adapter support digital EDID you should be ok. DVI came out during the transition from pure analog to digital, so there is some backwards compatibility (DVI-D is pure digital, DVI-I is mixed, DVI-A is pure analog). There are some pictures of the DVI connector variations you might find useful here:
https://en.wikipedia.org/wiki/Digital_Visual_Interface

If you are testing, and if EDID is available, then you should find some data via:

sudo -s
cat `find /sys -name 'edid'`
exit

I want to be sure this is OK before purchasing the cable.
This is my monitor.

I won’t guarantee the monitor will work, but the monitor’s specification says it has a DVI-D connector (digital). This means that if and only if the EDID of that monitor is compatible with the modes the driver is able to work with, then an adapter for DVI-D to HDMI will work. Keep in mind that some adapters do not pass through EDID (adapters intended for analog DVI save money by cutting that wire).

I have a dual-port HDMI cable and I want to purchase an HDMI-DVI converter.

Is this OK? This one doesn't have the 4 pins on the right, so it's DVI-D, but the DVI port on my monitor does have the 4 pins on the right.

Screenshot from 2020-06-05 00-05-05

This is my monitor port. Is the above converter OK for my monitor?

Dual link will contain both analog and digital. It might work, and likely would, but there is no guarantee (I say "likely" because dual link doesn't really have a purpose if digital is not available; that just doesn't mean I can guarantee it passes through correctly). My personal thought is that it is worth testing.