Temporal Dithering & Migraines

Thanks in advance for any help. I suffer from severe migraines triggered by the temporal dithering algorithms used on graphics cards. I have struggled with this for many years, since MacBooks have temporal dithering baked into the operating system and all AMD cards use it by default.

Using Windows desktop computers and laptops with Nvidia graphics cards has been a good solution for the last few years, since Nvidia's drivers don't enable dithering by default. However, the new GTX 10 series seems to have dithering enabled, and it's causing me big problems.

When I disable the GTX 1050 in Device Manager I am able to use the laptop again without issue. Since this wasn't an issue with the GTX 960M, something must have changed in the drivers between these generations that enables some form of dithering.

Is anyone aware of a method for disabling the dithering algorithm used by the GTX 1050, via the registry or otherwise? There is a forum of people at www.ledstrain.org with a similar issue who would deeply appreciate any suggestions or pointers. Any help is massively appreciated; I fully acknowledge that this is a strange request.

I second this request for a way to disable the temporal dithering that seems to be enabled on new Nvidia graphics cards. We replaced all of our workstations with new computers that all have Quadro P5000s. We also use PCoIP, and we now have serious issues with our workflow, as PCoIP basically breaks down when temporal dithering is enabled. With various laptops not having 8-bit panels, this is a serious issue for us. It was never a problem before, as dithering was disabled on the older cards.

Nvidia please give us the option to disable this. These new Quadro cards have become a nightmare for our workflow.

I recommend filing a bug report with NVIDIA as soon as feasible. These forums are not designed as a bug-reporting channel, just as a platform for users to assist other users.

The issue discussed here appears to have nothing to do with CUDA (the topic of this forum), but rather with graphics, so make sure to file the bug report through the appropriate website. I have never filed a bug for NVIDIA graphics, so I can't point you to the right place.

What’s PCOIP, by the way? I have never come across this acronym.

OK, thanks for the pointers. I got here from searching for temporal dithering and didn't even look to see where I was; sorry. PCoIP (PC-over-IP) is a protocol for working remotely from your computer (like Remote Desktop). It works by encoding everything into an H.264 stream, but with dithering the colours are always changing, which degrades performance to the point of being nearly unusable.
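To illustrate why per-pixel temporal noise is so hostile to a delta-based codec: a codec that transmits compressed frame-to-frame differences pays almost nothing for static content, but dithering makes every pixel change every frame. The following toy model (made-up frame data and zlib as a crude stand-in for a real video codec, not PCoIP or H.264 itself) shows the effect:

```python
import random
import zlib

random.seed(0)

WIDTH, HEIGHT = 64, 64

# A static 8-bit grayscale frame (flat mid-gray), as raw bytes.
base_frame = bytes([128] * (WIDTH * HEIGHT))

def dither(frame):
    """Toy temporal dither: nudge each pixel by +/-1 every frame."""
    return bytes((p + random.choice((-1, 1))) & 0xFF for p in frame)

def delta_size(prev, curr):
    """Compressed size of the frame-to-frame difference -- a crude
    stand-in for what a delta-based codec has to transmit."""
    diff = bytes((a - b) & 0xFF for a, b in zip(prev, curr))
    return len(zlib.compress(diff))

# Static content: consecutive frames are identical, so the delta is
# all zeros and compresses to almost nothing.
static_cost = delta_size(base_frame, base_frame)

# Dithered content: the image looks the same to the eye, but every
# pixel differs between frames, so the delta is noise and barely
# compresses at all.
dithered_cost = delta_size(dither(base_frame), dither(base_frame))

print(static_cost, dithered_cost)
```

Even though the dithered frames are visually indistinguishable from the static one, the per-frame delta the encoder must ship is orders of magnitude larger, which matches the described breakdown of remote-display performance.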

I would also be interested in a way to disable dithering for the GTX 1070.

It appears to be baked into the hardware with the 10xx series :( and I think it's causing me issues.

The way to provide feedback to NVIDIA in actionable fashion is to file a feature request via a bug report. Forum posts on such matters generally accomplish nothing, because the relevant decision makers do not frequent these forums.

Apparently on Linux there is support to turn this driver feature off.

https://techsupport.teradici.com/link/portal/15134/15164/Article/232/How-do-I-turn-off-temporal-dithering-in-a-NVIDIA-graphics-card-15134-232

also explained here
http://us.download.nvidia.com/XFree86/Linux-x86/375.10/README/xconfigoptions.html
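For reference, the Linux driver exposes its dithering controls through `nvidia-settings` display attributes. The snippet below is a sketch, not a verified recipe: the attribute names, the `DP-0` display name, and the value mapping (0 = auto, 1 = enabled, 2 = disabled) should all be double-checked against your own driver version's README and `nvidia-settings -q all` output.

```shell
# Query the current dithering configuration for connected displays.
# (Attribute names can vary across driver versions; check
# `nvidia-settings -q all | grep -i dither` on your system.)
nvidia-settings -q Dithering -q DitheringMode -q DitheringDepth

# Force dithering off on a specific connector. "DP-0" is only an
# example -- substitute the display name reported by the query above.
# For the Dithering attribute: 0 = auto, 1 = enabled, 2 = disabled.
nvidia-settings -a "[dpy:DP-0]/Dithering=2"
```

Note this only changes what the driver requests; it would not help if, as speculated later in this thread, the dithering on some cards happens regardless of the driver setting.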

I do not know about registry options for Windows, sorry

Yes. I know. I run Linux.

It just seems that with the 1070, temporal dithering is on in hardware even when the driver turns it off. And in 99% of cases with the 900-series and earlier cards, there was no dithering on Windows.

So no one at NVIDIA reads these forums? Should I contact NVIDIA directly, since it's presumably a VBIOS problem?

I wouldn't know which hardware and/or software components are responsible for implementing and configuring dithering, but I can tell you with certainty that it is not an issue with "CUDA Programming and Performance", which is what this forum is all about.

As far as I am aware, nobody at NVIDIA has "reading the forums" in their job description. If NVIDIA employees read these forums, they do so out of their own interest. Because everybody has a specific skill set and area of responsibility, only on exceedingly rare occasions would they happen to be the engineers who can directly address an issue brought up here. Sometimes someone from NVIDIA might file a bug report based on an issue brought up in the forums, with the consequence that they can access the bug status but the person from the forums cannot (for confidentiality, access to bug reports is restricted to the filer and relevant NVIDIA personnel).

So my standing recommendation is: find the appropriate bug-reporting form and report any issues through that designated support channel. There are people whose responsibility it is to verify and collate those reports, others who have to prioritize the many bug reports and feature requests, and finally the engineers who actually fix the issues deemed fix-worthy. As you can see, this is a pipeline, and depending on the priority assigned to an issue, it can be a lengthy one.

“but I can tell you with certainty that it is not an issue with “CUDA Programming and Performance” which is what this forum is all about”

I didn't mean to put this thread here; I know it's the wrong place. I've done CUDA programming in the past, so when I landed here I was all "huh".

Thanks for the advice.

Hi,

Did anyone resolve this? I bought a Quadro for the express purpose of avoiding Intel's built-in dithering, and now I have the same problem with the NVIDIA Quadro. How do I turn it off on Windows 10?

Not my area of expertise; I simply used a Google search. Have you had a look at this? The article has a Windows section.

https://help.teradici.com/s/article/1048

How do I turn off temporal dithering on a NVIDIA graphics card?

Thanks. It starts with "There is no known method for turning off dithering on Nvidia GPUs," which seems bizarre.

The way I understand the information on that page: NVIDIA GPUs do not use temporal dithering, despite various people asking NVIDIA for years to enable dithering because they observe color banding. However, dithering can occur if the monitor offers fewer bits per color channel than the NVIDIA GPU provides, in which case the dithering is actually done by the monitor.
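As a sketch of that mechanism: temporal dithering approximates a color level the panel cannot display by alternating between the two nearest levels it can display, so the time-average comes out right. This toy Python model (my own illustration of the general technique, not any vendor's actual algorithm) dithers an 8-bit level onto a 6-bit panel:

```python
def quantize_6bit(value_8bit):
    """Truncate an 8-bit level (0-255) to the 6-bit level at or below it.
    6-bit panels can only show levels that are multiples of 4."""
    return (value_8bit >> 2) << 2

def temporal_dither_6bit(value_8bit, n_frames):
    """Alternate between the two displayable 6-bit levels surrounding
    the 8-bit value so their time-average approximates it."""
    low = quantize_6bit(value_8bit)
    high = min(low + 4, 252)
    frac = (value_8bit - low) / 4  # fraction of frames shown at `high`
    frames = []
    err = 0.0
    for _ in range(n_frames):
        err += frac            # accumulate the quantization error...
        if err >= 1.0:         # ...and emit the brighter level when it
            frames.append(high)  # adds up to a full step
            err -= 1.0
        else:
            frames.append(low)
    return frames

# An 8-bit level of 129 sits between the 6-bit levels 128 and 132.
frames = temporal_dither_6bit(129, 8)
avg = sum(frames) / len(frames)
print(frames, avg)
```

Every individual frame only ever shows a level the 6-bit panel can display, yet the average over frames recovers the original 8-bit value: the price is exactly the frame-to-frame flicker discussed in this thread.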

As I said, not my area of expertise. Certainly not related to CUDA programming and performance, which is the topic of this subforum.