Do recent GeForce GPUs support 10-bit LUT accuracy in 8bpc output mode via digital ports?

I did some tests on color curve correction with an 8bpc LCD panel five years ago. With a GeForce 9800 GT (on the newest driver at the time), there was no grayscale loss on the VGA port, but severe grayscale loss (ugly banding) on the HDMI/DVI ports. My AMD HD 3650, by contrast, stayed smooth on all ports, thanks to the 8->10-bit dithering technique AMD uses. (The color curve itself, taken from the ICC profile, is accurate to 16 bits, much higher than 10-bit.)
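(To illustrate why that dithering matters, here is a minimal Python sketch; the curve and numbers are made up for illustration, not taken from a real LUT. Adding noise before 8-bit quantization preserves sub-LSB detail once the eye averages neighboring pixels, which is why the dithered output shows no bands.)

```python
import numpy as np

# A nearly flat grayscale region after gamma correction: the true signal
# drifts by about one 8-bit step across the whole span, which is exactly
# where banding is most visible. (Illustrative values, not a real LUT.)
x = np.linspace(0.0, 1.0, 8192)
true_signal = 0.5 + 0.004 * x          # spans roughly 1 LSB of 8-bit range

def quantize(signal, dither=False, seed=0):
    """Quantize a 0..1 signal to 8 bits, optionally with random dithering."""
    v = signal * 255.0
    if dither:
        v = v + np.random.default_rng(seed).uniform(-0.5, 0.5, size=v.shape)
    return np.clip(np.round(v), 0, 255) / 255.0

plain = quantize(true_signal)
dithered = quantize(true_signal, dither=True)

# Low-pass both outputs (the eye averages neighboring pixels): plain
# quantization leaves a hard step that no amount of averaging removes,
# while the dither noise averages back toward the true level.
kernel = np.ones(256) / 256
smooth_plain = np.convolve(plain, kernel, mode="valid")
smooth_dith = np.convolve(dithered, kernel, mode="valid")
truth = np.convolve(true_signal, kernel, mode="valid")

print("max error, plain:   ", np.abs(smooth_plain - truth).max())
print("max error, dithered:", np.abs(smooth_dith - truth).max())
```

The dithered path comes out far closer to the true signal after averaging, which is the whole trick behind AMD's approach.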

This is important for my graphics card replacement decision in the next few days:

Do recent GeForce graphics cards (such as the GTX 970 or 1070, with the newest driver on Windows) support 10-bit or higher LUT accuracy (via dithering, as AMD GPUs do) to keep grayscale gradients smooth while applying a color curve or gamma correction in 8bpc output mode via the digital ports (DP/HDMI/DVI)?

Thanks for helping!

Sadly not; on my 1070 it’s the same story with the banding artefacts.

Nvidia seems unreachable on this issue. I have never seen a single comment from anyone at Nvidia about calibration problems such as this bit-depth issue, or the other issue where the LUT is reset in full-screen 3D applications.

However, there are two things you can do to mitigate these issues:

  1. Get a monitor with on-board gamma and white-balance controls, so that you can get the monitor as close to your calibration targets as possible. The closer you get, the less banding there will be after calibration. This assumes the monitor itself uses high-precision dithering for its own calibration controls (some of the cheaper ones don’t).

  2. For full-screen 3D applications, use ReShade to inject a shader containing your calibration. You can choose between a 1D LUT and a 3D LUT; see the dispcal forum for details. Shader precision should be higher than 8-bit, although I haven’t tested this yet.

http://www.avsforum.com/forum/139-display-calibration/2084098-3d-luts-direct3d-opengl-applications-e-g-games-under-windows.html
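To give a rough idea of what such a calibration shader does, here is a sketch in plain Python (not actual ReShade FX, and the LUT values are made up): each channel value is looked up through a 1D LUT at float precision with linear interpolation, so rounding to 8 bits happens only once, at the final output stage.

```python
import numpy as np

# Hypothetical 1D calibration LUT: 1024 entries, values in 0..1.
# A real one would be generated from your ICC profile / calibration data.
lut_size = 1024
lut = np.linspace(0.0, 1.0, lut_size) ** (2.2 / 2.35)   # made-up correction

def apply_1d_lut(pixel, lut):
    """Look up a 0..1 channel value through the LUT with linear
    interpolation at float precision; conceptually this is what a
    calibration shader does per pixel before the 8-bit output stage."""
    pos = np.clip(pixel, 0.0, 1.0) * (len(lut) - 1)
    lo = int(np.floor(pos))
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1.0 - frac) + lut[hi] * frac

print(apply_1d_lut(0.5, lut))
```

Because the lookup and interpolation happen in float, the correction itself doesn’t add quantization on top of the display’s 8 bits.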

With the above it should be possible to achieve automatic high precision LUT enforcement for full screen 3D applications.

For the desktop it will be limited to low precision.