Currently, VDPAU does not seem to support HEVC Main 10 at all, even where the hardware should support it (for example, vdpauinfo returns “— not supported —” for HEVC_MAIN_10 on a GTX 960).
From what I understand, HEVC Main 10 is supported via other hardware decoding APIs (e.g. d3d11va) on suitable hardware. Main 10 seems to be the most common profile among the HEVC test clips I’ve seen, and it is also the only profile my CPU is too slow to decode in real time, so this would be a pretty neat feature to have.
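For reference, the check vdpauinfo performs boils down to a single VdpDecoderQueryCapabilities call. Here is a minimal sketch of that probe (error handling mostly omitted; the build line is an assumption):

```c
/* Probe VDPAU for HEVC Main 10 decode support, as vdpauinfo does.
 * Build (assumption): cc probe.c -o probe -lvdpau -lX11 */
#include <stdio.h>
#include <X11/Xlib.h>
#include <vdpau/vdpau.h>
#include <vdpau/vdpau_x11.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    VdpDevice dev;
    VdpGetProcAddress *get_proc;
    if (vdp_device_create_x11(dpy, DefaultScreen(dpy), &dev, &get_proc) != VDP_STATUS_OK)
        return 1;

    VdpDecoderQueryCapabilities *query;
    get_proc(dev, VDP_FUNC_ID_DECODER_QUERY_CAPABILITIES, (void **)&query);

    VdpBool ok = VDP_FALSE;
    uint32_t max_level, max_mbs, max_w, max_h;
    if (query(dev, VDP_DECODER_PROFILE_HEVC_MAIN_10,
              &ok, &max_level, &max_mbs, &max_w, &max_h) != VDP_STATUS_OK)
        ok = VDP_FALSE;
    printf("HEVC_MAIN_10: %s\n", ok ? "supported" : "--- not supported ---");
    return 0;
}
```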
There are two issues with HEVC_MAIN_10. First, you’re right that the driver doesn’t support it yet. For future reference, that’s tracked in internal bug 1617735.
The other issue is an API one: VDPAU doesn’t provide a way to allocate, render to, or display a 10-bit surface. So even if bug 1617735 is fixed, the driver would have to dither down to 8 bits for display. Internal bug 1632828 tracks adding 10-bit surface support to the API and the driver.
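To make the API gap concrete, here is a sketch of the allocation path as it stands (video_surface_create being the VdpVideoSurfaceCreate proc pointer obtained via VdpGetProcAddress):

```c
/* The only 4:2:0 chroma type vdpau.h defines today is the 8-bit one,
 * so 10-bit decoder output has no surface type to land in untouched. */
VdpVideoSurface surf;
video_surface_create(dev, VDP_CHROMA_TYPE_420, /* 8-bit 4:2:0 */
                     3840, 2160, &surf);
/* A 10-bit variant (say, a hypothetical VDP_CHROMA_TYPE_420_10) does not
 * exist yet; until bug 1632828 adds one, the driver could only dither
 * Main 10 output down to 8 bits to fit this surface. */
```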
Sorry I don’t have better news with regard to enabling these features.
UltraHD (BT.2020) completely deprecates interlaced content, so it is unlikely we will ever see interlaced material again - certainly not with HEVC content. If you are going to redesign VDPAU to support 10-bit surfaces and add a new API for this, it might be worth dropping support for interlacing in that API/implementation as well and just mapping frames directly.
“Added support for VDPAU Feature Set H to the NVIDIA VDPAU driver. GPUs with VDPAU Feature Set H are capable of hardware-accelerated decoding of 8192x8192 (8k) H.265/HEVC video streams.”
Dithering down to 8 bit is A-OK. Most users don’t have a 30-bit display, and even if they do, they often haven’t enabled it because some GL applications get confused by it anyway.
HEVC 10-bit is the most common encoding since it’s used by Blu-ray. It’s also superior to 8-bit encoding even when the final target is an 8-bit display, since a 10-bit HEVC stream compresses better than an 8-bit one with dither added before encoding.
So yes, please implement 10-bit HEVC decode support on Linux, and just dither it to 8 bits post-decode to fit the VDPAU API.
Dithering down to 8 bit is OK for the VDPAU mixer (plus conversion to RGB; note that the driver should only dither after colorspace conversion and tone mapping, not before) and the display functions, but dithering down to 8 bit (or touching the video surface in any other way) is NOT OK when mapping the decoded planes directly.
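A minimal sketch of that ordering constraint, with toy transfer functions standing in for the real EOTF/OETF and tone-mapping curves (all helper names are hypothetical):

```c
#include <math.h>
#include <stdint.h>

/* Toy placeholders for the real curves. */
static float eotf(float v)     { return powf(v, 2.2f); }          /* decode to linear   */
static float oetf(float v)     { return powf(v, 1.0f / 2.2f); }   /* encode for display */
static float tone_map(float v) { return v / (1.0f + v); }         /* Reinhard-style     */

/* Present one 10-bit sample (0..1023) on an 8-bit display. The point is
 * the ordering: quantization, with dither noise in [0,1), happens exactly
 * once, as the very last step. Dithering before conversion or tone
 * mapping would bake 8-bit banding into everything downstream. */
static uint8_t present_sample(uint16_t v10, float noise)
{
    float linear  = eotf(v10 / 1023.0f);   /* to linear light          */
    float mapped  = tone_map(linear);      /* e.g. HDR -> SDR          */
    float encoded = oetf(mapped);          /* back to display encoding */
    float scaled  = encoded * 255.0f + noise;
    return (uint8_t)(scaled > 255.0f ? 255.0f : scaled);
}
```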
To be useful and reliable, hardware decoders for formats with a fixed specification (e.g. H.264, HEVC) MUST output the same (identical) result as an equivalent software decoder. Doing anything else has a number of drawbacks. This is especially important for HDR content, for which dithering to 8 bit would basically destroy the image completely.
haasn, you’re completely correct: ideally we want full native 10/12-bit and HDR support. But such support needs to be added to VDPAU first, and that may take years. Dithering down to 8 bit is within NVIDIA’s control, can be done immediately, and is a very useful stopgap. NVIDIA can update the decoder later, when VDPAU extends its API to 10/12 bit and HDR.
I have a 4K display and a GTX 960 with an HEVC 10-bit decoder, yet I can’t play HEVC 10-bit content on Linux (with CPU decoding) without stuttering. On Windows the GTX 960 decodes it with the CPU nearly idle. It’s worth getting that hardware decode working on Linux now, living with today’s VDPAU limitations, and adding HDR and 30/36-bit native support in future years when that becomes possible.
Wow. That’s like a slap in the face :(
AMD (with their, in my opinion, horrible Linux support) gets HEVC 10-bit decode with VDPAU, while NVIDIA users still have to wait (maybe months)?
I have been waiting for what feels like forever for this. In the end I bought a £35 Android TV box off eBay; it plays Main 10 and even HDR. I would advise everyone to do the same, as NVIDIA is in no rush to get this to us. It sucks that I have a £400 Pascal card and no driver support.
The cuvid/nvdecode library in the latest 375.20 drivers has support for both 12-bit decode and P016 output surfaces. Of course, these won’t be documented until the next SDK release, but I’ve been habitually testing for them to show up, and they finally did.
I’ve got pending changes for ffmpeg and mpv to support it, so there’s a solution here if you’re prepared to abandon vdpau (insert joke that nvidia already did).
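For the curious, the decoder-creation side of that looks roughly like the sketch below, going by the still-undocumented nvcuvid headers shipped alongside 375.20; treat the exact field values as assumptions until the next SDK release documents them. A current CUDA context is assumed, and error handling is omitted:

```c
#include <nvcuvid.h>

/* Create a 10/12-bit HEVC decoder with 16-bit (P016) output surfaces.
 * bit_depth is 10 for Main 10, 12 for Main 12. */
CUvideodecoder create_hevc_hibit_decoder(unsigned width, unsigned height,
                                         unsigned bit_depth)
{
    CUVIDDECODECREATEINFO info = {0};
    info.ulWidth             = width;
    info.ulHeight            = height;
    info.ulTargetWidth       = width;
    info.ulTargetHeight      = height;
    info.CodecType           = cudaVideoCodec_HEVC;
    info.ChromaFormat        = cudaVideoChromaFormat_420;
    info.bitDepthMinus8      = bit_depth - 8;                  /* 2 or 4 */
    info.OutputFormat        = cudaVideoSurfaceFormat_P016;    /* 16-bit planes */
    info.DeinterlaceMode     = cudaVideoDeinterlaceMode_Weave; /* progressive */
    info.ulNumDecodeSurfaces = 20;
    info.ulNumOutputSurfaces = 1;

    CUvideodecoder dec = NULL;
    cuvidCreateDecoder(&dec, &info);  /* CUDA context must be current */
    return dec;
}
```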
Hi, I have noticed others getting hardware acceleration using cuvid/nvdecode but haven’t been able to get it working myself. What kind of player are they using for this, and would it be possible to get it working with Kodi?
Using Linux and a GTX 950, I was under the impression that my PC could play HEVC/H.265 content flawlessly. Instead, I face heavy stuttering when playing such video files.
Running vdpauinfo, I had to learn that only the base profile is supported, not the 10-bit HEVC profile. Because I go with a lightweight dual core, there is not enough CPU power for software decoding.
I am wondering: if I were to buy new hardware components, what would be the best choice to get a fully HEVC-compatible HTPC running right now?
Should I upgrade to a GTX 1050/Ti, and will that work with Linux?
Should I switch to AMD and their current RX series? I have heard AMD has bad Linux drivers but have never personally tested an AMD GPU with Linux…
Should I go with Kaby Lake’s iGPU? Intel says it supports HEVC too, but is that true on Linux?
Thanks for your advice ;)
And BTW: if you find a solution to make my GTX 950 play that HEVC content, it’s still welcome!!!
As of last month, the VDPAU API has a representation of 10- and 12-bit surfaces and the various other datatypes required for HEVC Main 10/12.
This was in release 1.4 and has now hit the repositories of several Linux distributions.
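For anyone tracking this, the decode path the new API enables looks, in sketch form, like the following (proc pointers obtained via VdpGetProcAddress as usual; type names as in the current vdpau.h; error handling omitted):

```c
/* Proc pointers, fetched via VdpGetProcAddress beforehand. */
VdpVideoSurfaceCreate *video_surface_create;
VdpDecoderCreate      *decoder_create;

/* 10-bit decode target: 4:2:0 with 16-bit sample containers. */
VdpVideoSurface surf;
video_surface_create(dev, VDP_CHROMA_TYPE_420_16, 3840, 2160, &surf);

/* Main 10 decoder; VdpDecoderRender() would then target the 16-bit
 * surface, and VdpVideoSurfaceGetBitsYCbCr() can read it back as
 * VDP_YCBCR_FORMAT_P010 or VDP_YCBCR_FORMAT_P016. */
VdpDecoder dec;
decoder_create(dev, VDP_DECODER_PROFILE_HEVC_MAIN_10,
               3840, 2160, /* max_references = */ 8, &dec);
```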
When might we expect to see the relevant functionality exposed in the driver? Right now vdpauinfo shows that only 8-bit HEVC is supported, even on hardware that supports HEVC Main 10/12 via NVDEC.