Screen flickering at 4k@60hz

As mentioned in the link, only by forging a custom EDID. So not really.

Maybe one last resort would be to ignore the EDID completely by using
Option "IgnoreEDID" "true"
and keep the ModeValidation line but remove NoEDIDModes from the list.
That means running it blindly and hoping the custom modes work.

Sorry, just checked: IgnoreEDID is outdated, it's
Option "UseEDID" "false"
now.
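For reference, a minimal sketch of how those options would sit in the Device section of xorg.conf; the Identifier and the ModeValidation token list here are illustrative, not taken from your actual config:

```
Section "Device"
    Identifier  "nvidia"
    Driver      "nvidia"
    # Ignore the monitor's EDID entirely (IgnoreEDID is the outdated name)
    Option      "UseEDID" "false"
    # Keep validation relaxed, but without NoEDIDModes in the token list
    Option      "ModeValidation" "AllowNonEdidModes, NoMaxPClkCheck"
EndSection
```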

I tried this and got an "Invalid mode" message on a black screen of my monitor. Then I changed the cable connection from DP to HDMI on the nvidia card and got a login screen. I tried both

Modeline "3840x2160x30" 593.41 3840 4016 4104 4400 2160 2168 2178 2250 +HSync +Vsync
Modeline "3840x2160x60" 297 3840 4016 4104 4400 2160 2168 2178 2250 +HSync +Vsync

and

Modeline "3840x2160x60" 593.41 3840 4016 4104 4400 2160 2168 2178 2250 +HSync +Vsync
Modeline "3840x2160x30" 297 3840 4016 4104 4400 2160 2168 2178 2250 +HSync +Vsync

with the same result.
I’ve just found this resource https://github.com/akatrevorjay/edid-generator
Do you think it's possible to generate a correct EDID for my case with it?
conf and logs are attached.

xorg.conf.tar.gz (996 Bytes)
Xorg.0.log.tar.gz (17.3 KB)
nvidia-bug-report.log.gz (254 KB)

I just realized, did you use an active DisplayPort->HDMI converter?
What gtx950 do you have exactly? (Brand)

Yes, I'm using a Club 3D DP2HDMI converter: http://www.club-3d.com/en/detail/2366/displayportt-1.2-to-hdmit-2.0-uhd-active-adapter/
I like it because with it the display seems a bit more responsive to mouse movement.
The graphics card is a Zotac ZT-90601-10L.

LOL. You should have mentioned that earlier. That converter doesn't work with the Linux driver. Reason unknown.
[url]https://devtalk.nvidia.com/default/topic/1000526/linux/problem-with-4k-60hz-with-nvidia-linux-x86_64-375-39/1[/url]
Another mystery is why your HDMI port is capped at a max pixel clock of 165 MHz. The GTX 950 should have HDMI 2.0.

[url]http://old.zotac.com/en/products/graphics-cards/geforce-900-series/product/geforce-900-series/detail/geforce-gtx-950-zt-90601-10l/sort/starttime/order/DESC/amount/10/section/specifications.html[/url]
Says: HDMI 4k@60

Please delete the IgnoreEDID, UseEDID and ModeValidation lines and retry over HDMI.

That's right, it does have support for HDMI 2.0, no problem. The thing is that via the Club 3D adapter the mouse lag is slightly lower. However, you are right: when I switched the cable from DP to HDMI everything got sorted out and I can run 4k@60Hz, thank you. It is definitely strange why this adapter works with Windows and macOS and does not with Linux.
However, I still have a question. At this resolution I see color distortion, especially on black text on a white background. It looks like the system is using 4:2:0 chroma subsampling, while in OS X I don't see such distortion, which should mean macOS is using 4:4:4. Is it possible to override the default chroma subsampling settings in xorg.conf or somewhere else?

Thank you.

Option "ColorSpace" "DFP-1: YCbCr444"
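For context, that option goes in the Screen (or Device) section of xorg.conf. A minimal sketch, assuming the TV shows up as DFP-1 (check your Xorg.0.log for the actual output name):

```
Section "Screen"
    Identifier "Screen0"
    Device     "nvidia"
    # Request full-resolution chroma (4:4:4) instead of YUV420 on that output;
    # "DFP-1" is an assumption - use the output name from your Xorg.0.log
    Option     "ColorSpace" "DFP-1: YCbCr444"
EndSection
```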

Do you have a new xorg log for me?

About mouse lag: is your TV switched to PC mode to reduce lag from post-processing? Maybe the converter is forcing this?

I was struggling the whole day trying to make the colorspace option work, but no luck. Attached are the latest xorg.conf and log files.

I set this mode as the default on my TV. It is active all the time.

xorg.conf.tar.gz (1 KB)
Xorg.0.log.tar.gz (24.6 KB)
nvidia-bug-report.log.gz (270 KB)

I've parsed the EDID and it says that at 4k@50/60Hz only YUV420 is supported, the same as the nvidia driver reports. So it can't be forced.
On some TVs RGB has to be switched on explicitly to support 4:4:4 at 4k@60Hz, via something like 'UHD Color' or the like. Still, it's a mystery why the converter ignores that or gets different values and works in Windows and macOS.
Maybe it's a cable problem: is it a 'high speed' cable? It could be that when the GPU and TV negotiate link speed they fall back to HDMI 1.4 speeds and the TV sends a different EDID, and the converter just ignores that and forces 2.0 speeds.

The cable was bought as an HDMI 2.0 cable; the brand is Linoya. And it works perfectly with macOS and Windows. As for the DP-to-HDMI adapter, in fact it is not needed; everything works fine without it under macOS and Windows. So I think both macOS and Windows disregard the EDID and use their own settings for graphics. At least in macOS there is no way to make it work at 4k@60Hz without SwitchResX; I mean, out of the box this resolution does not work. However, once set up in SwitchResX, it provides the full colorspace at 4k@60Hz. Maybe it is possible to make a custom EDID for such a resolution and the full colorspace, as described here: https://github.com/akatrevorjay/edid-generator?

Forging an EDID would be the option left. The modelines appear in your Xorg.0.log.
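If you do end up with a forged edid.bin, the nvidia driver can be pointed at the file instead of reading the EDID from the TV. A hedged sketch; the DFP-1 output name and the file path are assumptions, adjust them to your setup:

```
Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    # Use a forged EDID file instead of the one reported by the display;
    # output name and path are examples
    Option     "CustomEDID" "DFP-1:/etc/X11/edid.bin"
EndSection
```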

Just took a look at the SwitchResX output again; it is indeed faking the EDID for the 4k@60Hz mode. It inserts an additional DisplayID block.

Maybe it's possible to make an EDID out of the SwitchResX output? If you know how to do this, it would be great if you could guide me.

There you go, attached.
Howto: copy/paste the EDID text to a file, then convert it using xxd -r -p.
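In other words, the hex text and the binary are interchangeable via xxd. A small round-trip sketch; the hex shown is only the 8-byte EDID header magic as a stand-in, not a full 128/256-byte dump:

```shell
# Paste the real hex dump from SwitchResX into edid.txt; the line below
# writes only the standard EDID header bytes as a placeholder.
printf '00ffffffffffff00\n' > edid.txt
xxd -r -p edid.txt > edid.bin   # hex text -> binary edid.bin
xxd -p edid.bin                 # binary -> hex text again (the reverse)
```

edid-decode can then read edid.bin directly, and xxd -p gives you back the text form if you ever need to edit and re-convert it.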

edid.zip (435 Bytes)

I didn't quite understand. What you sent me is a zipped edid.bin. I can parse it and see what's inside with edid-decode. However, I have no idea how to convert it to text and then convert it back using xxd.