I want to write an OpenGL application that can display 10-bit images on Linux through my 10-bit monitor. So I configured X with 30-bit color depth in xorg.conf, then wrote a simple program, main.cpp (5.1 KB), which draws a grid of 32x32 cells running from darkest to brightest luminance to verify the 10-bit output. See the attachment <main.cpp>.
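For reference, the 30-bit depth setup described above typically looks like the sketch below in xorg.conf (the "Screen0" identifier is assumed; match it to your own config):

```
Section "Screen"
    Identifier   "Screen0"      # assumed identifier; use the one from your config
    DefaultDepth 30             # 10 bits per color channel
    SubSection "Display"
        Depth 30
    EndSubSection
    # ForceCompositionPipeline can also be set statically here instead of
    # toggling it in nvidia-settings:
    # Option "metamodes" "nvidia-auto-select +0+0 {ForceCompositionPipeline=On}"
EndSection
```

After editing, restart the X server and confirm the depth with `xdpyinfo | grep "depth of root"`, which should report 30.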
At first, everything looks fine: the grayscale gradient is very smooth. Here is a screenshot of the sample program.
But when I enable the “ForceCompositionPipeline” option in nvidia-settings (to eliminate screen tearing), the output seems to fall back to 8-bit: the grayscale gradient is no longer smooth and shows visible banding. If I disable “ForceCompositionPipeline”, the 10-bit output comes back immediately.
What is going wrong here?
OS: CentOS 7.6 SP1
Display Card Driver: 430.14
Display Card: Quadro P1000
nvidia-bug-report.log.gz (1.2 MB)