Recent NVidia Cards Flickering Causing Health Issues

Hello,

I was advised by NVidia support to post this question in this section of the NVidia developer forums, although this problem does not impact Linux specifically, and I fully realize this is not the ideal location for such a request. Please feel free to move it to any other section. I am posting this in the hope that it will catch the attention of NVidia engineers and developers.

Several years ago, NVidia graphics cards started to produce “unpleasant” output - the image is not stable and exhibits some form of flickering. This causes eye strain and feelings of nausea. Over the past several years I have seen a number of different graphics cards on various systems and monitors, and the problem is getting worse with every generation. I am not alone with this problem: a whole community of people who can spot the issue and suffer various symptoms from it has emerged around the ledstrain.org forum - and although the forum covers many issues with other products as well, issues with NVidia cards come up often.
I myself am extremely sensitive to flicker, and - unlike many other people, for whom the card’s output simply produces health symptoms - I can clearly see the instability. And it is not caused by the monitors.

After spending dozens of hours studying the issue with various hardware and software, I am very confident the problem is real; at this point I am even considering getting in touch with technology universities to study it further with lab equipment and making the issue public.

I am personally convinced the issue impacts millions of people who are simply unaware of the source of their problems and blame general “computer fatigue”. The issue may also explain the recent popularity of dark themes, as they help hide any potential picture instability.

I would welcome any NVidia engineers looking into this issue; I am ready to provide further details as required.

Please check in nvidia-settings whether you have temporal dithering enabled and disable it; this is known to cause issues.
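
On Linux you can also check and change this from the command line. A minimal sketch, assuming your display is named DP-0 (substitute the output name reported by nvidia-settings --query dpys) and that the documented attribute values still apply (Dithering: 0 = auto, 1 = enabled, 2 = disabled); the CurrentDithering attribute reports what the hardware is actually applying, not just what was requested:

    # Query the requested and the actually applied dithering state:
    nvidia-settings --query "[DPY:DP-0]/Dithering"
    nvidia-settings --query "[DPY:DP-0]/CurrentDithering"
    # Force dithering off for that display:
    nvidia-settings --assign "[DPY:DP-0]/Dithering=2"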

:) Hi, many thanks for the advice, but if it were that simple, I and many other people wouldn’t have spent dozens of hours in desperate attempts to find the problem’s cause and solve it.

Of course it is disabled (at least on Linux; on Windows you have to edit the registry for that), but this does not solve it. As a side note: with temporal dithering manually enabled, the problem does get worse, and interestingly the card seems to dither even when the monitor’s color depth exactly matches the one configured in the NVidia settings, in which case it should not be dithering at all.

However, this problem goes way deeper. Existing theories include (but are not limited to):

  • some form of temporal dithering remains in effect even when it is turned off, either to improve performance or color fidelity
  • lossy memory compression on the card that causes this as a side effect
  • picture handling in the card’s output buffer
  • some form of “incompatibility” between the card and the monitor in terms of video timings - my tests show very clearly that different video timing algorithms (GTF/CVT/DMT/CVT reduced blanking) do have an impact on picture stability, tested with many cards and monitors (see the sketch after this list). Unfortunately, no setting really solves the issue; at best it improves it to some extent - but the fact that these algorithms have an impact at all, and that this is not common knowledge, is very interesting in itself
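
For anyone who wants to repeat the timing experiments, here is a minimal sketch for X11 using the standard cvt, gtf and xrandr utilities. The output name (DP-0) and the resolution are only examples; adjust them to your setup:

    # Generate candidate modelines with different timing algorithms:
    cvt 1920 1080 60        # CVT
    cvt -r 1920 1080 60     # CVT with reduced blanking
    gtf 1920 1080 60        # GTF

    # Register and apply one of the printed modelines, e.g. the
    # CVT reduced-blanking one:
    xrandr --newmode "1920x1080R" 138.50 1920 1968 2000 2080 1080 1083 1088 1111 +hsync -vsync
    xrandr --addmode DP-0 "1920x1080R"
    xrandr --output DP-0 --mode "1920x1080R"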

My personal belief is that this may be deliberate on the part of graphics card manufacturers - some new way of improving performance. They tested it on a sample of people who were OK with it and able to endure it, and so they went ahead and implemented it. Unfortunately, there is a not-so-small group of people out there who can’t stand it; it causes severe health symptoms, including nausea, headaches, etc. Recent cards give me feelings not dissimilar to how 60Hz CRTs felt back in the 90s. I would give anything to reduce the performance of my card to, say, 30% just to get rid of this side effect. If you can’t look at the picture produced, you no longer care about performance.

Unfortunately, the fact is that manufacturers like NVidia occasionally get these reports, but they either dismiss them with “output from new cards does not differ from the older generations” (don’t tell me - I can SEE the difference on the same PC with the same monitor, only different cards) or they simply ignore them (probably knowing what it is but wanting to keep it quiet to maintain the general perception of great performance of new products). Which seems to be the case in this instance as well… :(

However, if I and other people from the ledstrain community are right, this may be a big case. Large manufacturers are playing with our health, making decisions without the public or healthcare authorities being allowed any say in the matter. I have a hard time believing that even if most people can’t directly spot the problem, it has no effect on them in the long term.

Of course, I may be wrong and there is some easy explanation and a fix (deep down I actually hope there is), but so far my findings are hard to interpret in any different way.

https://forums.developer.nvidia.com/t/nvidia-2080ti-gfx-output-modifies-pixels/160030

Very interesting, thanks a lot for that. Unfortunately, the conclusion is not 100% clear. Additionally, I am getting issues with GTX 16xx cards as well (but they use Turing too, so that may be the reason).

gtx16 is basically the same as rtx20 minus raytracing. The link was merely meant as a lead; this has been observed on the technical level as well. IIRC, there’s also another post where the poster used DRM dumb buffers to avoid interference from graphics libraries and likewise noticed fluctuating pixel values.
The problem is that there has been no real in-depth analysis so far and no usable data is available.
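
If anyone wants to start gathering such data, one possible approach is to display a static image and diff consecutive frames captured from the card’s output. This is only a sketch: it assumes an HDMI/DP capture device on /dev/video0, plus ffmpeg and ImageMagick; the image viewer, file names and resolution are placeholders. Note that a capture device that compresses or scales will itself introduce frame-to-frame noise, so the capture chain first has to be validated against a source known to produce stable output:

    # Display a static test image full-screen (any viewer will do):
    feh --fullscreen gray50.png &

    # Grab 60 consecutive frames from the capture device:
    ffmpeg -f v4l2 -video_size 1920x1080 -i /dev/video0 -frames:v 60 frame_%03d.png

    # Count pixels that differ between consecutive frames; a truly
    # static output should report 0 for every pair:
    for i in $(seq -f '%03g' 1 59); do
        j=$(printf '%03d' $((10#$i + 1)))
        n=$(compare -metric AE frame_$i.png frame_$j.png null: 2>&1)
        echo "frames $i/$j: $n differing pixels"
    done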

Which version of the driver are you seeing problems with? There was an issue with the dithering configuration that was fixed in version 525.89.02:

2023-02-08 version 525.89.02

    * Fixed a bug that could cause banding or flickering of color
      gradients on DisplayPort monitors.

However, if that were the issue that’s affecting your monitor, manually disabling dithering should have stopped any flickering, although you would have seen banding in that case.
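
You can check which driver version is actually loaded with either of these standard commands:

    cat /proc/driver/nvidia/version
    nvidia-smi --query-gpu=driver_version --format=csv,noheader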

It is not specific to any version; it affects many systems and OSes, and connections via HDMI, DP and DVI alike. It can even be seen in the BIOS or on the GRUB bootloader screen.

@aplattner Are you an NVidia employee? Is there a chance to get a statement from NVidia engineers on dithering or other similar pixel handling that could cause such an issue and that may have been implemented or changed over the past years? And can they rule out the hypotheses I describe above?

Thank you very much.

The display pipeline is highly configurable so I don’t know offhand what it would be doing when the boot firmware first turns on the display, but once the driver is loaded and active, disabling dithering in the control panel should disable anything that would be causing pixel values to change on a frame-by-frame basis. When dithering is enabled, the “Dynamic 2x2” and “Temporal” options do try to interpolate color values using time-based dithering, but “Static 2x2” should only use spatial dithering.
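
For testing, the per-display dithering mode can also be queried and set from the command line. A sketch assuming a display named DP-0 and the documented attribute values (DitheringMode: 0 = auto, 1 = dynamic 2x2, 2 = static 2x2, 3 = temporal); the Current* attributes report what the hardware is actually applying, which is useful for verifying that a request took effect:

    nvidia-settings --query "[DPY:DP-0]/DitheringMode"
    nvidia-settings --query "[DPY:DP-0]/CurrentDitheringMode"
    nvidia-settings --query "[DPY:DP-0]/CurrentDitheringDepth"
    # Request spatial-only (static 2x2) dithering:
    nvidia-settings --assign "[DPY:DP-0]/DitheringMode=2"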

What you write sounds very logical, but this is not what is being observed. Picture instability remains even after turning dithering off or selecting the static variant. Again, if the problem were that simple to solve, you wouldn’t see this post here.

@aplattner Additionally, what suggests it must be some sort of dithering - and that it could be intentional - is that increasing the refresh rate generally makes the problem better. As asked above, is there a chance to pass this up the NVidia hierarchy and get a reply from someone really knowledgeable? I am currently unable to use any NVidia products, and I am not alone. I believe that having independent institutions study the problem is an unnecessarily extreme approach when there are probably people out there who might “just know”.

So unfortunately this seems to have the usual ending: the problem is either denied or ignored. How sad. It is, however, so widespread and obvious that it is hard to believe NVidia is entirely unaware.

Yeah, I still have this problem on my RTX A4000. I have a true 10-bit (no FRC) monitor which requires zero dithering, but the RTX A4000 is still dithering even in the BIOS. When I swap in a Quadro K4200, the image is clean again.

I have the same hardware and am having this exact issue. Nvidia, please fix this.

I’m not sure if my problem is some kind of ‘flickering’, but what is described here sounds similar. Over the past few months I’ve tried two laptops, both with an Intel UHD 770 iGPU; one had an Nvidia RTX 4060, the other an Nvidia RTX 4070. When working with the Intel card everything seems fine and I can work for multiple hours without any symptoms.

Once I set the laptop to use only the dedicated graphics card, I start getting a strange feeling in the back of my head/a migraine. At first I thought it might just need some getting used to, but when I switched back to Intel everything was fine again. So I believe something really is different when using the Nvidia cards. Personally I can’t really see a visual difference (or at least not without being able to compare the different cards next to each other), but the Nvidia cards really become uncomfortable after multiple hours of use.

I stumbled across this thread and others like it because it finally dawned on me that my migraines were related to my gaming computer.

Long story short, I’ve been suffering from migraines for the past 2 years. The frequency varies.
I tried all sorts of things - quitting alcohol for an extended period of time, quitting coffee. I thought it was the scented candles my wife likes. I went back to the optometrist and got new glasses.
I came close to figuring out the problem - I bought a far better quality monitor (which cost a fortune).

None of it worked.

Recently, I got a really bad cold - close on 2 weeks of symptoms - during that time, I didn’t turn on my gaming rig once. The migraines went away.
The other day, I had to use my gaming rig for some Android phone backup stuff.
Within 30 minutes, I could feel the start of a migraine.

Guess what: 2 years ago - which correlates with how long I’ve been getting migraines - I bought a new video card, an Nvidia RTX 3060.

I’m highly reluctant to experiment with turning temporal dithering off, as the migraines can be so bad they can last 2 days. It makes it hard to work.

For now, I’m going to put my old GTX 970 back in my rig - and do some more research.

Perhaps moving to Radeon cards is the solution.

Also, I moved from Windows to Linux for gaming - so clearly the same issues exist on Linux too!

+1 to this issue, still in September 2024. I just bought a Lenovo Legion T5 tower with an RTX 3060 installed, and I love everything about it except for the unbearable eye strain, which is so bad that I must return it. Like others in this thread, it happens on every screen (even the BIOS) when the monitor is plugged into the GPU (and this system has no other video-out ports). I use the same monitor every day when working from home with no issues. I’ve tweaked every imaginable NVIDIA setting (refresh rate, color accuracy, dithering, vsync, gsync, and many more). It really does feel like an NVIDIA hardware issue. Please fix this.

I also have eye strain issues when using an Nvidia GPU, but never with the iGPU.

I’ll feel spacey and get migraines. Like others here, I noticed I can work for hours using the Intel GPU, but when I plug the monitor into my 2080 Ti things just feel different.

I recently bought a Lenovo Legion 7i Pro, which gave me extreme headaches.

Using 8-bit full range seems to help, but I’m also wondering if there is more going on here.
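
In case it helps anyone else, on Linux something like the following should pin the output to full-range RGB. This is only a sketch: DP-0 is a placeholder, and the attribute values (ColorRange: 0 = full, 1 = limited; ColorSpace: 0 = RGB) should be double-checked against your driver’s nvidia-settings documentation:

    nvidia-settings --assign "[DPY:DP-0]/ColorSpace=0"
    nvidia-settings --assign "[DPY:DP-0]/ColorRange=0"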