Eyestrain: Linux display quality doesn't look as good as Win10 over HDMI 2.0. Chroma issue?

Hi Nvidia experts!

I have been struggling with Nvidia’s Linux drivers for quite some time (8 months or so). My main issue is that I could never get graphics to look “right” under Linux; by right, I mean the same as under Windows 10. I could never put my finger on what was wrong, but the text just didn’t look the same. This caused me a lot of eyestrain, and I could never use my computer under Linux the way I can under Windows. Even after I installed Windows fonts, the text was still not as good, and I spent a lot of time messing with fontconfig etc. But fontconfig was definitely not the issue. I did figure out what was wrong, but I still haven’t been able to fix it.

I’ll cut right to the chase: open this test PNG file in your default image viewer (or a web browser), make sure it’s set to 100% (no scaling) and look at the bottom two lines. If they’re crisp and readable, your Nvidia driver is performing fine. If the two lines are fuzzy, it could be that you don’t have a display that supports 4:4:4 chroma (YCbCr) mode or it could be that you’re running Nvidia drivers under Linux.

Test: http://i.rtings.com/images/test-materials/2017/chroma-444.png

I have a 4K Samsung HDR display connected to my Nvidia GTX 1060 over an HDMI 2.0 cable. I use Arch Linux with the latest available Nvidia binary drivers. Under Windows 10, that test pattern looks absolutely PERFECT on my screen: text is sharp and the bottom two lines are perfectly readable and look just right. But when I boot Linux and look at the image, the test pattern looks almost like 4:2:2 chroma subsampling… it resembles this image:

http://i.rtings.com/images/reviews/ju7100/ju7100-text-chroma-4k-30hz-large.jpg
These images come from here: http://www.rtings.com/tv/learn/chroma-subsampling
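For intuition about why fine text suffers: in 4:2:2 mode the chroma channels are stored once per horizontal pair of pixels, so single-pixel-wide colored detail gets averaged away. A minimal sketch in plain Python (illustrative values, not the driver's actual code):

```python
def subsample_422(chroma_row):
    """Average each horizontal pixel pair's chroma, then upsample back.

    This mimics what 4:2:2 subsampling does to one chroma row; it is an
    illustrative model only.
    """
    out = []
    for i in range(0, len(chroma_row), 2):
        avg = (chroma_row[i] + chroma_row[i + 1]) / 2
        out.extend([avg, avg])  # both pixels now share one chroma sample
    return out

# Strongly alternating chroma, like 1px-wide red/blue text strokes:
print(subsample_422([16, 240, 16, 240]))  # -> [128.0, 128.0, 128.0, 128.0]
```

Every alternating pair collapses to the same mid value, which is exactly the washed-out fringing visible in the rtings photo.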

I have tried:

  1. Using a different cable. I’ve tried 3 different cables including certified 4K UHD cables. Issue persists.

  2. Plugging HDMI cable into different ports on my display. Only when I plugged it into Port 1 (also labeled DVI) did it work well under Windows, but Linux still didn’t look as good.

  3. I have tried all kinds of xorg.conf parameters and I just can’t get it to display colors the same way Windows 10 does. This leads me to believe that the issue is not hardware but software.

  4. I’ve tried RGB mode, YCbCr mode, disabling dithering. I’ve tried enabling Force Composition Pipeline, Full Force Composition Pipeline… nothing helps.

  5. I have tried several Linux distributions. Same issue.

  6. I have tried different image viewers. Same issue.

  7. I have tried a friend’s GTX 1070 card. Same issue.

  8. I have bought a professional monitor colorimeter and have calibrated the display under both Linux and Windows. The curves definitely do look a bit different. But the issue persists.
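For reference, the composition-pipeline settings from step 4 can be applied via the metamodes option in the Screen section of xorg.conf, roughly like this (a sketch; the identifiers and mode name are placeholders to match against your own config, option names per the Nvidia driver README):

```
Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
    # Force the full composition pipeline; use
    # "ForceCompositionPipeline = On" for the lighter variant.
    Option "metamodes" "nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
EndSection
```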

I have not been able to find the fix. I’m at a loss as to what could be wrong, and I suspect only you, Nvidia folks, can help me fix this and make my Linux display look the same as the Windows one. I’m willing to give you my phone number and other contact information, and I’m willing to run any diagnostics you may need so you can fix this. I’d really like to use Linux all the time but this issue is preventing me from doing so.

nvidia-bug-report.log.gz is here: http://ge.tt/9LATwnm2

Thank you so much for reading!

Please put

Option  "ModeDebug"  "true"

in the Screen section of your xorg.conf, reboot, and run nvidia-bug-report.sh again.
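If your xorg.conf doesn’t already have an explicit Screen section, a minimal one carrying the option looks roughly like this (identifiers are placeholders; match the Device/Monitor sections already in your config):

```
Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
    Monitor    "Monitor0"
    # Log detailed mode/colorspace decisions to Xorg.0.log
    Option "ModeDebug" "true"
EndSection
```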

NickJ01
It looks fine.
Ubuntu MATE 16.04, latest Nvidia driver, GT 610.

Hi generix, here’s the new log with mode debug turned on: http://ge.tt/1RyNHom2

Thank you!

Mounir, what kind of display are you running and how is your graphics card connected to it?

NickJ01
Samsung 40" LED TV at 1920x1080 50p, connected with an HDMI cable.

According to the logs, everything is fine: the right modes and colorspaces are detected and RGB is used.
Can you provide a photo where the effect is visible? Does the TV have a menu where it shows details about the mode used?

generix, thank you so much for looking at this! I truly appreciate it; it’s good to know I haven’t made any obvious mistakes.

I’ll try to take some pics when I get home later today, but I can tell you that it looks almost exactly like this pic: http://i.rtings.com/images/reviews/ju7100/ju7100-text-chroma-4k-30hz-large.jpg (the bottom two lines are nearly identical).

I have a friend coming over with his brand new AMD card that he hasn’t even opened yet and we’ll see what the picture looks like. That will eliminate another possible issue… that there’s something wrong with the Linux kernel and Xorg itself. I’ll post results tonight.

Thanks again.

Btw, what does the nvidia-settings gui display for your TV?

Well, today we tried an RX 480 in place of my 1060. We booted Linux after installing the AMD drivers and had to fiddle with xorg.conf; after about 30 minutes of messing around, we got it working well. There were no screen issues at all: the test pattern showed up perfectly and looked just like it does under Windows. That proves that neither my hardware nor my Linux install is at fault.

The culprit must be the Nvidia binary driver. Something it does is not the same as it does in Windows… or one of my settings is wrong.

generix, here’s the album of all nvidia-settings screens with actual information on them:

https://imgur.com/a/k4BW7

Please look it over and tell me if you notice anything strange/wrong.

Thank you so much! I really appreciate it.

BTW, settings on the TV are set to Game Mode (i.e. there’s no processing done by the display at all). The OS itself cannot change the settings.

After looking at the colorimeter outputs of the ICC profiles, I’ve noticed that the luminance is over 2x higher under Windows 10 than under Linux. I have no idea how that’s possible.
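One cheap thing to rule out (purely speculation on my part, not something the logs confirm) is a full-vs-limited RGB range mismatch: limited-range video levels span 16-235, so "white" sent as code 235 to a panel expecting full range (0-255) loses some luminance. A back-of-the-envelope check in Python:

```python
# Rough gamma-2.2 luminance of limited-range "white" (code 235)
# shown on a display that expects full-range (0-255) input.
GAMMA = 2.2
limited_white = (235 / 255) ** GAMMA
print(limited_white)  # roughly 0.84 of full-range white
```

A range mismatch alone clearly can’t explain a 2x luminance gap, but it is easy to check and rule out via the color range setting in nvidia-settings.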

Windows: https://i.imgur.com/l9T4t5V.jpg

Linux: https://i.imgur.com/oKHW2jS.png

Just like the logs, everything in nvidia-settings looks fine. It’s a mystery.
You’re currently running the beta driver; did you at any point try the stable one, or downgrade generally?

No, I have not tried downgrading. I’d have to recompile the kernel to make everything compatible. I have 387.12 installed right now and I’m not sure which one to pick… 384.90 maybe, since that’s what Nvidia currently has listed on the downloads page?

But it definitely is weird, and I believe it’s driver-related. Win10 looks fine, Radeon looks fine, but only Nvidia under Linux doesn’t look as good.

Don’t get me wrong, the difference is probably not immediately perceptible to most people, but it definitely is there. Maybe that’s why it hasn’t been detected by others until now?
And I think, at this point, only someone on Nvidia’s Unix team could troubleshoot this. I’m hoping someone from that team is reading this forum, but I’m not getting my hopes up.

I do not have a solution, but I want to report that the problem is real and that I can see it too. I can confirm that I see the problem on all the Nvidia cards I have tried under Linux. The problem is even visible on an AMD Vega 64, but it works fine with the Intel drivers (??) using the same monitors. I can also confirm that everything works fine under Windows. None of my friends feels the eyestrain, but I can see the problem immediately just by looking at the screen :(

Maybe this has something to do with dithering. Did you try
Option "FlatPanelProperties" "Dithering=disabled"
in xorg.conf?
https://download.nvidia.com/XFree86/Linux-x86_64/384.98/README/xconfigoptions.html
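For placement, that option goes in the Device section, e.g. (a sketch; the identifier is a placeholder, and whether FlatPanelProperties still accepts Dithering depends on your driver version, per the README linked above):

```
Section "Device"
    Identifier "Device0"
    Driver     "nvidia"
    # Disable dithering on the connected panel
    Option "FlatPanelProperties" "Dithering = disabled"
EndSection
```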