I’m currently working on a project to build a generic demonstration platform for displaying 10-bit HLG HDR content delivered over MPEG-DASH. I’m trying this out on a generic PC with an NVIDIA GeForce GT 1030 GPU, running the 390.59 driver on Fedora 28 (the RPM Fusion-packaged driver).
I’m using NVDEC to decode the video content, but I’m having trouble actually displaying it. Currently, the only way I can get a picture is to convert the P016 frames that come out of the NVDECODE API to RGBA16 and then draw them onto the display with OpenGL. However, this conversion is lossy: I lose the HDR aspects of the stream, which reduces the quality of the image.
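For reference, the matrix step involved in that conversion is, as far as I understand it, roughly the following (a simplified single-pixel sketch in C, assuming limited-range BT.2020 coefficients; my actual path does this as part of the P016→RGBA16 conversion, and the HLG transfer function is left untouched):

```c
#include <stdint.h>

/* Sketch: convert one P016 pixel to normalized non-linear BT.2020 RGB.
 * P016 left-justifies the 10-bit code values in 16-bit samples, so the
 * useful data sits in the top 10 bits. Limited (video) range assumed:
 * Y in [64, 940], chroma in [64, 960] centered on 512. */
static void p016_to_rgb(uint16_t y16, uint16_t cb16, uint16_t cr16,
                        float *r, float *g, float *b)
{
    /* Recover the 10-bit code values from the high bits. */
    int y10  = y16  >> 6;
    int cb10 = cb16 >> 6;
    int cr10 = cr16 >> 6;

    /* Normalize from limited range to [0,1] / [-0.5,0.5]. */
    float yn  = (y10  -  64) / 876.0f;
    float cbn = (cb10 - 512) / 896.0f;
    float crn = (cr10 - 512) / 896.0f;

    /* BT.2020 non-constant-luminance matrix (Kr = 0.2627, Kb = 0.0593). */
    *r = yn + 1.4746f * crn;
    *g = yn - 0.16455f * cbn - 0.57135f * crn;
    *b = yn + 1.8814f * cbn;
}
```

The point being: even done at 16 bits per channel, this step bakes the YCbCr→RGB matrix into the frame, which is exactly what I’d like to avoid if I can scan out YCbCr directly.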
I already have the HDMI output running in YCbCr, so is there any way to pass the decoded YUV data straight out over HDMI without doing that RGBA conversion?
I’ve been looking into Kernel Mode Setting (KMS), which other Linux graphics drivers use to give direct control over displays. From what I now understand, though, the NVIDIA driver doesn’t offer the same level of control. Certainly, none of the KMS and DRM utilities and tools available to me seem to support NVIDIA’s mode setting, and beyond some web articles and release notes saying that the NVIDIA driver now supports some form of KMS, I can’t get it to work outside of a GNOME session.
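In case my setup is the problem: the only KMS-related control I’ve found in the NVIDIA driver is the nvidia-drm modeset module parameter, which is off by default and which, as I understand it, is enabled on Fedora like this (file name is my own choice):

```shell
# Enable the NVIDIA DRM/KMS interface (disabled by default).
echo "options nvidia-drm modeset=1" | sudo tee /etc/modprobe.d/nvidia-drm-modeset.conf

# Rebuild the initramfs so the option takes effect at boot (Fedora).
sudo dracut --force

# After a reboot, verify that it stuck:
cat /sys/module/nvidia_drm/parameters/modeset   # expect "Y"
```

That parameter does report "Y" for me, but it still doesn’t let the standard DRM/KMS tooling drive the display the way I’d hoped.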
Is the above even possible with the current state of the NVIDIA drivers? Would it help to update my setup to the X.Org server 1.20 release and the new NVIDIA 396 drivers, which supposedly support HDR via the new X.Org DeepColor extension? I’m struggling to find any documentation on how well that is supported at the moment. I come from an embedded background, where we commonly have dedicated HDMI sinks, so I’m not sure how this is supposed to work on a commodity PC platform.
Thanks in advance,