GTX 1060 with 375.26 driver not outputting 10-bit color


I have a GTX 1060 3 GB with the 375.26 driver.

The card does not seem to be outputting 10-bit color, although the display depth is set to 30 in xorg.conf, and Xorg.0.log shows "Depth 30, RGB weight 101010" for the NVIDIA screen.

I tested with some 10-bit test videos from the internet; also, my TV shows a notification when it receives a 10/12-bit signal, and currently no such notification appears.

What is needed to enable 10-bit output on the 1060?


Please provide a link to authoritative evidence confirming that a budget, consumer-grade graphics card such as the GTX 1060 supports 10-bit color under GNU/Linux.

[i]"NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector. A small number of monitors support 10-bit per color with Quadro graphics cards over DVI. For more information on NVIDIA professional line of Quadro GPUs, please visit:"[/i]

Updated 08/02/2011
10-bit per color support on NVIDIA Geforce GPUs | NVIDIA



03 Feb 2016
RedShark News - 8-bit or 10-bit? The truth may surprise you


1st, I don’t quite understand why it falls to me to provide such “a link to authoritative evidence”. On the contrary, I’m posting here precisely to obtain such authoritative evidence.

2nd, I don’t care about DirectX surfaces, viewports, or Adobe Premiere Pro. I’m talking about 10-bit videos (like HEVC Main10), on Linux, with NVIDIA’s proprietary Linux drivers.

3rd, I happen to have a Quadro too (Fermi 600), and it doesn’t output 10-bit either (at least not over DVI-HDMI or DisplayPort-HDMI adapters).

4th, Pascal is marketed as having “PureVideo feature set H” (plenty of authoritative links can be found online). But as far as I can see, the current drivers lack both 10-bit hardware decoding (at least with VDPAU) and 10-bit output (regardless of whether it is video, a viewport, or something else).
It may well be that I simply misunderstand something, or just don’t know how to get these things working, and I would like an answer on how to fix this. Otherwise, I will be forced to conclude that NVIDIA’s marketing tells us (let’s put it like this:) not the whole truth.

5th, if you have nothing to say on topic, please don’t bother answering.


PS. I don’t know whether this link is authoritative enough or not.

So, decoding 10-bit HEVC now works, with CUDA.
10-bit output still doesn’t, at least over HDMI.
It’s a shame that NVIDIA’s own driver fails to do under Linux what it does under Windows…
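For anyone else trying the CUDA decode path: a minimal mpv configuration sketch that worked here (assuming an mpv build linked against an FFmpeg with CUDA/CUVID support; option names are from mpv of that era and may differ on other builds):

```
# ~/.config/mpv/mpv.conf - assumes mpv built with CUDA hwaccel support
hwdec=cuda      # decode on the GPU's fixed-function block (10-bit HEVC on Pascal)
vo=opengl       # OpenGL video output
```

If the build lacks CUDA support, mpv silently falls back to software decoding; check the terminal output for the hwdec actually in use.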

Even though it works in Windows, the NVIDIA Linux drivers don’t support 10-bit over HDMI, as stated in the README.

As for VDPAU output, it supports RGB10A2 surfaces, at least on Maxwell.

The Zotac GTX 1060-6G supports 10-bit color over the DisplayPort connection. I verified with Zotac support before buying the card that 10-bit is supported; they did not say it is supported on any other connection.
I have a monitor that supports 10-bit (8-bit + FRC), so I changed xorg.conf to set Depth 30, and it was obvious after restarting X that it had been applied.
However, the display is not correct in KDE/Plasma 5 (KWin). At the least, the ‘glow’ that normally appears around windows and on panels in the Oxygen theme was turned into flat black. …
That could be down to any number of factors, including at least the compositor, the OpenGL version in use (at the time OpenGL 2.0 was the default), KWin and/or the theme (Oxygen here), xorg 1.19, … and of course the way nVidia has written the driver code (375.26), or perhaps even the way it was compiled.

I ran xwininfo, and both Firefox and Krita had a depth of 30; however, the correct colormap was not installed for either.

This is my first foray into 10-bit color, so I do not yet know all the parts needed to put the engine together, so to speak. All I can say is: use DP and try again.

@fhj52 Thanks a lot for your input!
Could you please grab the EDID from your DP monitor and upload it somewhere?
This is how to get it with nvidia-settings:
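Once you have the raw EDID dump, you can at least check whether the sink advertises HDMI deep color. A rough Python sketch (this only covers the DC_30bit flag in the HDMI vendor-specific data block of a CEA-861 extension; DisplayPort capability is negotiated separately and will not show up here):

```python
def edid_supports_30bit_hdmi(edid: bytes) -> bool:
    """Scan CEA-861 extension blocks for an HDMI vendor-specific
    data block (IEEE OUI 00-0C-03) and test its DC_30bit flag."""
    # Walk the 128-byte extension blocks after the base EDID block.
    for off in range(128, len(edid), 128):
        block = edid[off:off + 128]
        if len(block) < 4 or block[0] != 0x02:      # 0x02 = CEA extension tag
            continue
        dtd_start = block[2]                        # data blocks end where DTDs begin
        i = 4
        while i < dtd_start:
            tag = block[i] >> 5                     # upper 3 bits: block type
            length = block[i] & 0x1F                # lower 5 bits: payload length
            payload = block[i + 1:i + 1 + length]
            # Vendor-specific data block (tag 3) with the HDMI OUI (little-endian).
            if tag == 3 and payload[:3] == bytes([0x03, 0x0C, 0x00]):
                if length >= 6:
                    return bool(payload[5] & 0x10)  # VSDB byte 6, bit 4 = DC_30bit
            i += 1 + length
    return False

# Usage: edid_supports_30bit_hdmi(open("edid.bin", "rb").read())
```

Note this only tells you what the display claims; it says nothing about what the driver will actually send.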

PS. Never mind.
I finally got it all working, sort of…

Glad it is working, “sort of” at least.
Sorry it took so long, but I have not been monitoring the email address for this DevTalk account for the last week or so. …

While I am familiar with getting EDID info, I’m not sure how that would help.

The display in use is a QNIX UHD32167R monitor, and I am currently using its DisplayPort (DP) input connected to a GTX 1060 DisplayPort output. There is no setting to enable/disable 10-bit, as the AHVA panel is 10-bit via 8-bit + FRC (frame rate control). AFAIK, if the driver does not send the signals to use the FRC for the extended colors, it simply does not use the FRC.

However, while using windOS it is necessary to enable 10-bit color output via the nVidia control panel: Change Resolution / Output Color Depth, set to 10 bpc.
I don’t know (and really do not care) what DirectX & windOS have or do in regards to 10 bpc, as it comes with caveats (otherwise known to honest people as LIES) about the OS or DX version (or something else that requires purchasing more SW and/or HW).

For Linux, xorg.conf (or whatever Xorg’s configuration file is called on the particular distribution in use) must be modified (IIRC, set “Depth” & “DefaultDepth” to 30 in the “Screen” section).
I have been waiting on the 378.xx driver for a fix to FAH in Linux; it may now have a setting in the Linux control panel (“nvidia-settings”) to enable/disable 10 bpc. …one can hope. … (I’m stuck in windOS for another day, so cannot say what the 378.13 or 375.39 driver does.)
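For reference, a minimal sketch of the relevant “Screen” section (the identifiers here are examples; match them to your own “Device” and “Monitor” sections):

```
Section "Screen"
    Identifier    "Screen0"
    Device        "Device0"
    Monitor       "Monitor0"
    DefaultDepth  30
    SubSection "Display"
        Depth     30
    EndSubSection
EndSection
```

After restarting X, `grep Depth /var/log/Xorg.0.log` should confirm whether depth 30 was actually applied.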

Also, one needs additional color palettes that are not part of a normal install on any distribution. They are available, however.
Once the extra colors prove useful/valuable for general usage, all the necessary items will be installed and set up by default. If the, IMO, enormous amount of time it took for SMP & 64-bit systems to become commonplace is any measure of how long it will take for 10-bit color to become standard… well, don’t hold your breath ;).

Good Luck & Have Fun!