Linux Driver Behaviour vs Windows Driver Behaviour

Hi,
When I set the resolution to 3840x4320 @ 60 Hz (1074 MHz pixel clock) under Linux (using ARandR), the pixel values of a static image change slightly. It is not visually noticeable, but when I capture the RGB data on my FPGA board (Xilinx ZCU102), I can see that the CRC of the data is changing.
Under Windows, the CRC remains constant at the same resolution.

Windows 10 64 bit driver: 440.97
Ubuntu 20.04 Linux driver: nvidia-driver-440 (proprietary, tested)

The EDID I use:
0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00, 0x61, 0x98, 0x23, 0x01, 0x00, 0x00, 0x00, 0x00,
0x28, 0x1C, 0x01, 0x04, 0xB5, 0x3C, 0x22, 0x78, 0x26, 0x61, 0x50, 0xA6, 0x56, 0x50, 0xA0, 0x00,
0x0D, 0x50, 0x54, 0xA5, 0x6B, 0x80, 0xD1, 0xC0, 0x81, 0xC0, 0x81, 0x00, 0x81, 0x80, 0xA9, 0x00,
0xB3, 0x00, 0xD1, 0xFC, 0x01, 0x01, 0x04, 0x74, 0x00, 0x30, 0xF2, 0x70, 0x5A, 0x80, 0xB0, 0x58,
0x8A, 0x00, 0x54, 0x4F, 0x21, 0x00, 0x00, 0x1A, 0x4D, 0xD0, 0x00, 0xA0, 0xF0, 0x70, 0x3E, 0x80,
0x30, 0x20, 0x35, 0x00, 0x56, 0x50, 0x21, 0x00, 0x00, 0x1A, 0x00, 0x00, 0x00, 0xFD, 0x08, 0x1E,
0x3C, 0x32, 0x0F, 0x6C, 0x01, 0x0A, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x00, 0x00, 0x00, 0xFC,
0x00, 0x58, 0x69, 0x6C, 0x69, 0x6E, 0x78, 0x20, 0x73, 0x69, 0x6E, 0x6B, 0x0A, 0x20, 0x01, 0xAE,
0x70, 0x12, 0x5A, 0x00, 0x00, 0x81, 0x00, 0x04, 0x23, 0x09, 0x03, 0x07, 0x03, 0x00, 0x50, 0x87,
0xA3, 0x01, 0x04, 0xFF, 0x0E, 0x9F, 0x00, 0x0F, 0x80, 0x0F, 0x00, 0xDF, 0x10, 0x9A, 0x00, 0x03,
0x00, 0x13, 0x00, 0x87, 0xA3, 0x01, 0x04, 0xFF, 0x0E, 0x9F, 0x00, 0x0F, 0x80, 0x0F, 0x00, 0xE1,
0x10, 0x98, 0x00, 0x03, 0x00, 0x13, 0x00, 0x87, 0xA3, 0x01, 0x08, 0xFF, 0x0E, 0x9F, 0x00, 0x0F,
0x80, 0x0F, 0x00, 0x74, 0x11, 0x05, 0x00, 0x03, 0x00, 0x00, 0x00, 0x87, 0xA3, 0x01, 0x08, 0xFF,
0x1D, 0x3F, 0x01, 0x2F, 0x80, 0x1F, 0x00, 0xDF, 0x10, 0x9A, 0x00, 0x02, 0x00, 0x04, 0x00, 0x60,
0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x90,
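
(For reference, a dump like this can be sanity-checked with the edid-decode tool — a minimal sketch, assuming the bytes above are saved verbatim in a file named edid.hex:)

# strip spaces, commas and the 0x prefixes, then convert the hex to raw bytes
tr -d ' ,' < edid.hex | sed 's/0x//g' | xxd -r -p > edid.bin
edid-decode edid.bin

edid-decode also flags a bad block checksum, which rules out one class of problems on the sink side.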

The HW setup is identical for Windows and Linux.

The Linux driver supports temporal dithering on GeForce-type cards, while the Windows driver does not.
You can disable it in xorg.conf using Option "FlatPanelProperties".
https://download.nvidia.com/XFree86/Linux-x86_64/384.98/README/xconfigoptions.html
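
For example, a minimal sketch of the relevant Screen section entry — assuming your panel is connected as DP-0 (adjust the connector name to whatever xrandr reports):

Section "Screen"
    ...
    Option "FlatPanelProperties" "DP-0: Dithering = Disabled"
EndSection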

Hi,
I didn’t have an xorg.conf file in /etc/X11, so I used the nvidia-settings program to create one and added the “FlatPanelProperties” option to it. I restarted Linux, but I still see the same behaviour.

My xorg.conf file:

# nvidia-settings: X configuration file generated by nvidia-settings
# nvidia-settings: version 440.64

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0" 0 0
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
    Option         "Xinerama" "0"
EndSection

Section "Files"
EndSection

Section "Module"
    Load           "dbe"
    Load           "extmod"
    Load           "type1"
    Load           "freetype"
    Load           "glx"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "Monitor"
    # HorizSync source: edid, VertRefresh source: edid
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "XLX Xilinx sink"
    HorizSync       50.0 - 270.0
    VertRefresh     30.0 - 60.0
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce RTX 2080 Ti"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "Stereo" "0"
    Option         "nvidiaXineramaInfoOrder" "DFP-5"
    Option         "metamodes" "DP-0: 3840x4320 +1920+0, DP-4: nvidia-auto-select +0+0"
    Option         "SLI" "Off"
    Option         "MultiGPU" "Off"
    Option         "BaseMosaic" "off"
    Option         "FlatPanelProperties" "DP-0: Dithering = Disabled"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection

Please attach the corresponding Xorg log.

Here it is.

Xorg.0.log (47.7 KB)

Thank you.

I forgot to mention that my graphics card is an NVIDIA RTX 2080 Ti.

Looks correctly set. What does
nvidia-settings -q CurrentDithering
return?

It says the value is 1.

However, I just unplugged the DP cable, re-plugged it, and ran the same command again; now it says the value is 0, and I can see my CRC is constant.
Is there a way to force the Dithering option via command line?

I also found out that if I just launch nvidia-settings with no parameters (so the GUI shows) and then exit, the CurrentDithering value goes back to 1, i.e. nvidia-settings -q CurrentDithering returns a value of 1 again.

I have to unplug and replug the DP cable for CurrentDithering to be 0.
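
(To catch exactly when the driver flips the value back, a simple polling loop can help — a sketch; the -t flag makes nvidia-settings print just the bare value:)

watch -n1 'nvidia-settings -t -q CurrentDithering'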

BTW, what is the use of the “Dithering” option?

Normally, using the option in xorg.conf should set the default dithering behaviour. At run time, it can be changed using nvidia-settings, e.g.
nvidia-settings -a Dithering=0
or
nvidia-settings -a DitheringMode=1
then
nvidia-settings -q CurrentDithering
nvidia-settings -q CurrentDitheringMode
should return the values just set. I just tried it, and there seems to be a bug in the driver: the value is ignored.
What do you mean by “what is the use of the Dithering option”?
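
(A related sketch: attributes can also be scoped to one display instead of being applied globally — assuming the panel shows up as DP-0; the names can be listed with nvidia-settings -q dpys:)

nvidia-settings -a "[dpy:DP-0]/Dithering=2"
nvidia-settings -q "[dpy:DP-0]/CurrentDithering"

Note that, if I read the attribute description (nvidia-settings -e Dithering) correctly, 0 = auto, 1 = enabled, 2 = disabled, so 2 rather than 0 should be the value that forces dithering off — worth double-checking on your driver version.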

You can also try setting it via the nvidia-settings GUI, GPU-0 -> DP-0 -> Controls.

What is the purpose of dithering the incoming image?
Is it supposed to improve quality, or something else?

To improve image quality / reduce banding, e.g. when the display supports only 6 bits per colour (6 bpc).
https://www.youtube.com/watch?v=Suv4Li3bTEI
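
As a rough worked example of why this shows up in a per-frame CRC: a 6-bit panel can only resolve about every 4th 8-bit level (..., 128, 132, ...), so to approximate the 8-bit value 130 temporal dithering alternates the two nearest levels from frame to frame:

frame 1: 128
frame 2: 132
frame 3: 128
frame 4: 132
average: (128 + 132 + 128 + 132) / 4 = 130

The image looks static, but the pixel data differs from frame to frame — which is exactly what a CRC over the captured RGB stream detects.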

But my display is supposed to be 8 bpc (24 bits per pixel).
Is there a reason to turn it on for 8 bpc?

Maybe yes, maybe no. I don’t know what the default “Auto” setting bases its decision on.

Setting it via the nvidia-settings GUI (GPU-0 -> DP-0 -> Controls) is working.

Thank you!