Recent NVidia Cards Flickering Causing Health Issues

Hello,

I was advised by NVidia support to post this question in this section of the NVidia developer forums. Although this problem does not impact Linux specifically, and I fully realize this is not the ideal location for such a request, please feel free to move it to any other section. I am posting it here in the hope that it will catch the attention of NVidia engineers and developers.

Several years ago, NVidia graphics cards started to produce “unpleasant” output: the image is not stable and exhibits some form of flickering. This causes eye strain and feelings of nausea. Over the past several years I have seen a number of different graphics cards on various systems and monitors, and the problem is getting worse with every generation. I am not alone with this: a whole community of people who can spot the issue, or who suffer various symptoms because of it, has emerged around the ledstrain.org forum - and although that forum covers many issues with other products as well, issues with NVidia cards come up repeatedly.
I myself am extremely sensitive to flicker and, unlike many other people for whom the card’s output simply produces health symptoms, I can clearly see the instability. And it is not caused by the monitors.

After spending dozens of hours studying the issue with various hardware and software, I am very confident the problem is real. At this point I am even considering getting in touch with technical universities to study the issue further with lab equipment and to make it public.

I am personally convinced the issue impacts millions of people who are simply unaware of the source of their problems and blame general “computer fatigue”. The issue may also explain the recent popularity of dark themes, as they help hide any potential picture instability.

I would welcome NVidia engineers looking into the issue; I am ready to provide any further details as required.

Please check in nvidia-settings whether you have temporal dithering enabled and disable it; this is known to cause issues.
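
For reference, a minimal sketch of how this can be queried and forced off from the command line on Linux using the nvidia-settings CLI. The display name DP-0 is only an example (list yours with “nvidia-settings -q dpys”), the attribute names and values are taken from the nvidia-settings documentation as far as I know, and it needs a running X session:

    import subprocess

    DPY = "DP-0"  # example display name; list the real ones with: nvidia-settings -q dpys

    def query(attr):
        """Read a per-display attribute through the nvidia-settings CLI."""
        out = subprocess.run(
            ["nvidia-settings", "--terse", "--query", f"[DPY:{DPY}]/{attr}"],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip()

    def assign(attr, value):
        """Write a per-display attribute through the nvidia-settings CLI."""
        subprocess.run(
            ["nvidia-settings", "--assign", f"[DPY:{DPY}]/{attr}={value}"],
            check=True,
        )

    print("before:", query("Dithering"), query("DitheringMode"))
    assign("Dithering", 2)      # 0 = auto, 1 = enabled, 2 = disabled
    assign("DitheringMode", 2)  # 0 = auto, 1 = dynamic 2x2, 2 = static 2x2, 3 = temporal
    print("after: ", query("CurrentDithering"), query("CurrentDitheringMode"))

If you do not want to disable it entirely, setting DitheringMode to static 2x2 should at least remove the temporal component.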

:) Hi, many thanks for the advice, but if it were that simple, I and many other people wouldn’t have spent dozens of hours in desperate attempts to find the problem’s cause and solve it.

Of course it is disabled (at least on Linux; on Windows you have to edit the registry for that), but that does not solve it. Admittedly, with temporal dithering manually enabled it is worse, and interestingly the card seems to dither even when the monitor has exactly the same color depth as configured in nvidia-settings, in which case it should not be dithering at all - but that is just a side note.

However, this problem goes much deeper. Existing theories include (but are not limited to):

  • some form of temporal dithering that is still in effect even when it is turned off, in order to improve either performance or color fidelity
  • memory compression on the card that is not lossless and causes this as a side effect
  • picture handling in the card’s output buffer
  • some form of “incompatibility” between the card and the monitor in terms of video timings - my tests show very clearly that different video timing settings and algorithms (GTF/CVT/DMT/CVT reduced blanking) do have an impact on picture stability, tested with many cards and monitors. Unfortunately no setting usually solves the issue completely, it only improves it to some extent - but the fact that these algorithms have an impact at all, and that this is not common knowledge, is very interesting in itself (a rough sketch of how I switch between timings when testing follows below this list)
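
A rough sketch of how I switch between those timing algorithms when testing. It assumes an X11 session with the standard cvt, gtf and xrandr tools installed; the output name and mode are examples and need to be adapted, and DMT modes are the fixed VESA timings the monitor advertises, so they are not generated here:

    import re
    import subprocess

    OUTPUT = "DP-0"            # example xrandr output name; list with: xrandr --query
    W, H, HZ = 1920, 1080, 60  # example mode

    def modeline(cmd):
        """Run cvt/gtf and return the numbers from the Modeline it prints."""
        text = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
        return re.search(r'Modeline\s+"[^"]+"\s+(.+)', text).group(1).split()

    def apply(name, params):
        """Register the mode with xrandr and switch the output to it."""
        subprocess.run(["xrandr", "--newmode", name, *params], check=True)
        subprocess.run(["xrandr", "--addmode", OUTPUT, name], check=True)
        subprocess.run(["xrandr", "--output", OUTPUT, "--mode", name], check=True)

    # The same resolution and refresh rate generated by three different timing algorithms.
    timings = {
        "cvt":    ["cvt", str(W), str(H), str(HZ)],
        "cvt-rb": ["cvt", "-r", str(W), str(H), str(HZ)],   # CVT with reduced blanking
        "gtf":    ["gtf", str(W), str(H), str(HZ)],
    }

    for name, cmd in timings.items():
        print(name, modeline(cmd))
        # apply("test-" + name, modeline(cmd))   # uncomment to actually switch modes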

My personal belief is that this may be deliberate on the part of graphics card manufacturers - some new way of improving performance. They tested it on a sample of people who were fine and able to endure it, so they went ahead and implemented it. Unfortunately, the group of people out there who can’t stand it is not small; it causes severe health symptoms, including nausea, headaches, etc. Recent cards give me a feeling not dissimilar to 60 Hz CRTs back in the 90s. I would give anything to reduce the performance of my card to, say, 30% just to get rid of this side effect. If you can’t look at the picture produced, you no longer care about performance.

Unfortunately, the fact is that manufacturers like NVidia occasionally get these reports, but they either dismiss them with “output from new cards does not differ from the older generations” (don’t tell me - I can SEE the difference on the same PC, same monitor, only different cards) or they simply ignore them (probably knowing what it is but wanting to keep it quiet to maintain the general impression of the great performance of new products). Which seems to be the case in this instance as well… :(

However, if I and other people from the ledstrain community are right, this may be a big deal: large manufacturers are playing with our health, making decisions without the public or healthcare authorities having any say in the matter. I have a hard time believing that even if most people can’t directly spot the problem, it has no effect on them in the long term.

Of course, I may be wrong and there is some easy explanation and a fix (deep down I actually hope there is), but so far my findings are hard to interpret in any other way.

https://forums.developer.nvidia.com/t/nvidia-2080ti-gfx-output-modifies-pixels/160030

Very interesting, thanks a lot for that. Unfortunately the conclusion there is not 100% clear. Additionally, I am getting issues with GTX 16xx cards as well (but they also use Turing, so this may be the reason).

GTX 16xx is basically the same as RTX 20xx minus ray tracing. The link was merely meant as a lead; this has been observed on the technical level as well. IIRC, there’s also another post where the poster used DRM dumb buffers to avoid interference from graphics libraries and noticed fluctuating pixel values as well.
The problem is that there has been no real in-depth analysis so far, and no usable data is available.
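
If someone wants to start collecting such data, one possible approach (just a sketch of a measurement idea, not what the posters above did) is to loop the card’s output into an HDMI capture device, display a completely static image, and check whether the captured pixel values fluctuate from frame to frame. A minimal Python/OpenCV example, assuming the capture device shows up as video device 0:

    import cv2
    import numpy as np

    # Grab a short burst of frames from an HDMI capture device (assumed to be
    # /dev/video0) while the card under test shows a completely static image.
    cap = cv2.VideoCapture(0)
    lo = hi = None
    for _ in range(120):                       # roughly two seconds at 60 fps
        ok, frame = cap.read()
        if not ok:
            raise RuntimeError("capture failed - check the device index")
        lo = frame if lo is None else np.minimum(lo, frame)
        hi = frame if hi is None else np.maximum(hi, frame)
    cap.release()

    # With a static source and no temporal processing, every captured pixel
    # should stay constant, so any per-pixel swing means values change over time.
    swing = hi.astype(np.int16) - lo.astype(np.int16)
    print("pixels that change during the burst:", int(np.count_nonzero(swing.max(axis=-1))))
    print("largest per-channel swing:", int(swing.max()))

The capture path itself obviously has to be uncompressed and lossless, otherwise the capture device will introduce fluctuations of its own.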

Which version of the driver are you seeing problems with? There was an issue with the dithering configuration that was fixed in version 525.89.02:

2023-02-08 version 525.89.02

    * Fixed a bug that could cause banding or flickering of color
      gradients on DisplayPort monitors.

However, if that were the issue that’s affecting your monitor, manually disabling dithering should have stopped any flickering, although you would have seen banding in that case.
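
(In case it helps with the version question: a minimal sketch of reading the currently loaded driver version on Linux, assuming either the /proc/driver/nvidia/version node or nvidia-smi is available:)

    import subprocess
    from pathlib import Path

    def nvidia_driver_version():
        """Return the loaded NVIDIA driver version, via /proc if present, else nvidia-smi."""
        node = Path("/proc/driver/nvidia/version")
        if node.exists():
            return node.read_text().splitlines()[0]
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip()

    print(nvidia_driver_version())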

It is not specific to any driver version; it affects many systems and OSes, and connections via HDMI, DP and DVI alike. It can even be seen in the BIOS or on the GRUB bootloader screen.

@aplattner Are you an NVidia employee? Is there a chance to get a statement from NVidia engineers on dithering or other similar pixel handling that could cause such an issue and that may have been implemented or changed over the past years? And can they rule out the hypotheses I describe above?

Thank you very much.

The display pipeline is highly configurable so I don’t know offhand what it would be doing when the boot firmware first turns on the display, but once the driver is loaded and active, disabling dithering in the control panel should disable anything that would be causing pixel values to change on a frame-by-frame basis. When dithering is enabled, the “Dynamic 2x2” and “Temporal” options do try to interpolate color values using time-based dithering, but “Static 2x2” should only use spatial dithering.

What you write sounds very logical, but it is not what is being observed. Picture instability remains even after turning dithering off or switching to the static variant. Again, if the problem were that simple to solve, you wouldn’t be seeing this post here.

@aplattner Additionally, the proof that it must be some sort of dithering, and that it could be intentional, is that increasing the refresh rate generally makes the problem better. As asked above, is there a chance to pass this up the NVidia hierarchy and get a reply from someone really knowledgeable? I am currently unable to use any NVidia products, and I am not alone. I believe that having independent institutions study the problem is an unnecessary and extreme approach when there are probably people out there who might “just know”.

So unfortunately this seems to have the usual ending: the problem is either denied or ignored. How sad. It is, however, so widespread and obvious that it is hard to believe NVidia is entirely unaware.

Yeah, I still have this problem on my RTX A4000, and I have a true 10-bit (no FRC) monitor which requires zero dithering, but the RTX A4000 is still dithering even in the BIOS. If I swap in a Quadro K4200, the image is clean again.