We have our own Jetson-based board driving a DisplayPort output, which is then converted to LVDS to drive a panel directly.
As part of our production board test we connect the board to a test fixture that lets us observe the data on the LVDS lanes. We drive a “walking one” / “walking zero” bit pattern to verify the integrity of each of the pixel bits on the LVDS output, roughly as in the sketch below.
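For reference, the pattern-generation side of the test is equivalent to this sketch (illustrative only, not our actual production code): one 24-bit RGB pixel value per test step, with a single bit set (walking one) or a single bit cleared (walking zero):

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    for (int bit = 0; bit < 24; bit++) {
        uint32_t one  = 1u << bit;           /* walking one: only this bit set */
        uint32_t zero = 0xFFFFFFu & ~one;    /* walking zero: only this bit clear */
        printf("bit %2d: walking-one 0x%06X, walking-zero 0x%06X\n",
               bit, (unsigned)one, (unsigned)zero);
    }
    return 0;
}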
It appears that some gamma correction is being applied somewhere: when we read the LVDS data back on the test fixture and compare it against the expected values, we see:
Bit 2 on link 0 failed: read 0x000005, expected 0x000004
Bit 2 on link 1 failed: read 0x000005, expected 0x000004
Bit 3 on link 0 failed: read 0x00000A, expected 0x000008
Bit 3 on link 1 failed: read 0x00000A, expected 0x000008
Bit 4 on link 0 failed: read 0x000014, expected 0x000010
Bit 4 on link 1 failed: read 0x000014, expected 0x000010
Bit 5 on link 0 failed: read 0x000028, expected 0x000020
Bit 5 on link 1 failed: read 0x000028, expected 0x000020
Bit 6 on link 0 failed: read 0x000051, expected 0x000040
Bit 6 on link 1 failed: read 0x000051, expected 0x000040
Bit 7 on link 0 failed: read 0x0000A3, expected 0x000080
Bit 7 on link 1 failed: read 0x0000A3, expected 0x000080
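Incidentally, every failing read is consistent with a truncating linear scaling of the expected value by roughly ×1.27 (163/128 fits all six points exactly), which to us looks more like a fixed LUT or colour-range scaling than a classic power-law gamma curve. A quick check (illustrative arithmetic only):

#include <stdio.h>

int main(void)
{
    const unsigned expected[] = { 0x04, 0x08, 0x10, 0x20, 0x40, 0x80 };
    const unsigned read[]     = { 0x05, 0x0A, 0x14, 0x28, 0x51, 0xA3 };

    for (int i = 0; i < 6; i++) {
        unsigned scaled = (expected[i] * 163u) / 128u;  /* truncating multiply */
        printf("expected 0x%02X -> 0x%02X, read 0x%02X  %s\n",
               expected[i], scaled, read[i],
               scaled == read[i] ? "match" : "MISMATCH");
    }
    return 0;
}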
The output in /var/log/Xorg.0.log suggests that no gamma correction is being applied:
[ 14836.992] (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
but the results above suggest otherwise.
Is there any way to fully disable gamma correction? I am open to customizing the kernel if necessary.
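For what it's worth, the userspace approach we have in mind is forcing an identity ramp through the standard XRandR gamma-ramp API, something like the sketch below, though presumably this will not help if the scaling happens downstream of the X gamma LUT (e.g. in a CSC stage or in the DP-to-LVDS path):

/* Build: gcc identity_gamma.c -o identity_gamma -lX11 -lXrandr */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

    XRRScreenResources *res = XRRGetScreenResources(dpy, DefaultRootWindow(dpy));
    for (int c = 0; res && c < res->ncrtc; c++) {
        int size = XRRGetCrtcGammaSize(dpy, res->crtcs[c]);
        if (size < 2)
            continue;
        XRRCrtcGamma *g = XRRAllocGamma(size);
        for (int i = 0; i < size; i++) {
            /* identity ramp: output level equals input level */
            unsigned short v = (unsigned short)((i * 65535L) / (size - 1));
            g->red[i] = g->green[i] = g->blue[i] = v;
        }
        XRRSetCrtcGamma(dpy, res->crtcs[c], g);
        XRRFreeGamma(g);
    }
    if (res) XRRFreeScreenResources(res);
    XSync(dpy, False);
    XCloseDisplay(dpy);
    return 0;
}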