VDPAU: Expose HEVC Main10 support where available on-die

Currently, VDPAU does not seem to support HEVC Main10 at all, even where the hardware should support it. (For example, vdpauinfo reports “--- not supported ---” for HEVC_MAIN_10 on a GTX 960.)

From what I understand, HEVC Main10 is supported via other hardware decoding APIs (e.g. d3d11va) on suitable hardware. Main10 seems to be the most common profile among the HEVC test clips I’ve seen, and it’s also the only profile my CPU is too slow to decode in real time, so this would be a pretty neat feature to have.

There are two issues with HEVC_MAIN_10. First, you’re right that the driver doesn’t support it yet. For future reference, that’s tracked in internal bug 1617735.

The other issue is an API one: VDPAU doesn’t provide a way to allocate, render to, or display a 10-bit surface. So even if bug 1617735 is fixed, the driver would have to dither down to 8 bits for display. Internal bug 1632828 tracks adding 10-bit surface support to the API and the driver.

Sorry I don’t have better news with regard to enabling these features.



What about HEVC Main12 which is supported by Pascal’s HEVC hardware decoder?

Will the driver & VDPAU API be updated to support Main12 also?

It might be worth linking this issue to the one I mentioned in https://devtalk.nvidia.com/default/topic/940217/linux/progressive-mode-vdpau-implementation-and-api-/

UltraHD (BT.2020) completely deprecates interlaced content, so it’s unlikely we will ever see it again - certainly not with HEVC content. If you are going to redesign VDPAU to support 10-bit surfaces and add a new API for this, it might be worth dropping support for interlacing in that API/implementation as well, and just mapping frames directly.


“Added support for VDPAU Feature Set H to the NVIDIA VDPAU driver. GPUs with VDPAU Feature Set H are capable of hardware-accelerated decoding of 8192x8192 (8k) H.265/HEVC video streams.”

Pascal is VDPAU Feature Set H hardware it seems.


“VDPAU does not currently support the HEVC Main 12 profile.”


display: :0   screen: 0
API version: 1
Information string: NVIDIA VDPAU Driver Shared Library  367.27  Thu Jun  9 18:23:31 PDT 2016

Video surface:

name   width height types
420     8192  8192  NV12 YV12 
422     8192  8192  UYVY YUYV 

Decoder capabilities:

name                        level macbs width height
MPEG1                           0 65536  4096  4096
MPEG2_SIMPLE                    3 65536  4096  4096
MPEG2_MAIN                      3 65536  4096  4096
H264_BASELINE                  41 65536  4096  4096
H264_MAIN                      41 65536  4096  4096
H264_HIGH                      41 65536  4096  4096
VC1_SIMPLE                      1  8190  2048  2048
VC1_MAIN                        2  8190  2048  2048
VC1_ADVANCED                    4  8190  2048  2048
MPEG4_PART2_SP                  3  8192  2048  2048
MPEG4_PART2_ASP                 5  8192  2048  2048
DIVX4_QMOBILE                   0  8192  2048  2048
DIVX4_MOBILE                    0  8192  2048  2048
DIVX4_HOME_THEATER              0  8192  2048  2048
DIVX4_HD_1080P                  0  8192  2048  2048
DIVX5_QMOBILE                   0  8192  2048  2048
DIVX5_MOBILE                    0  8192  2048  2048
DIVX5_HOME_THEATER              0  8192  2048  2048
DIVX5_HD_1080P                  0  8192  2048  2048
H264_CONSTRAINED_BASELINE      41 65536  4096  4096
H264_EXTENDED                  41 65536  4096  4096
H264_PROGRESSIVE_HIGH          41 65536  4096  4096
H264_CONSTRAINED_HIGH          41 65536  4096  4096
H264_HIGH_444_PREDICTIVE       41 65536  4096  4096
HEVC_MAIN                      153 262144  8192  8192
HEVC_MAIN_10                   --- not supported ---
HEVC_MAIN_STILL                --- not supported ---
HEVC_MAIN_12                   --- not supported ---
HEVC_MAIN_444                  --- not supported ---

Output surface:

name              width height nat types
B8G8R8A8         32768 32768    y  Y8U8V8A8 V8U8Y8A8 A4I4 I4A4 A8I8 I8A8 
R10G10B10A2      32768 32768    y  Y8U8V8A8 V8U8Y8A8 A4I4 I4A4 A8I8 I8A8 

Bitmap surface:

name              width height
B8G8R8A8         32768 32768
R8G8B8A8         32768 32768
R10G10B10A2      32768 32768
B10G10R10A2      32768 32768
A8               32768 32768

Video mixer:

feature name                    sup
INVERSE_TELECINE                 y
NOISE_REDUCTION                  y
SHARPNESS                        y
LUMA_KEY                         y

parameter name                  sup      min      max
VIDEO_SURFACE_WIDTH              y         1     8192
VIDEO_SURFACE_HEIGHT             y         1     8192
CHROMA_TYPE                      y  
LAYERS                           y         0        4

attribute name                  sup      min      max
BACKGROUND_COLOR                 y  
CSC_MATRIX                       y  
NOISE_REDUCTION_LEVEL            y      0.00     1.00
SHARPNESS_LEVEL                  y     -1.00     1.00
LUMA_KEY_MIN_LUMA                y  
LUMA_KEY_MAX_LUMA                y

Dithering down to 8 bit is A-OK. Most users don’t have a 30-bit display, and even if they do, they often haven’t activated it because some GL applications get confused by it anyway.
10-bit HEVC is the most common encoding since it’s used by Blu-ray. It’s also superior to 8-bit encoding even when the final target is an 8-bit display, since a 10-bit HEVC stream compresses better than an 8-bit one with dither added pre-encoding.

So yes, please, implement Linux 10 bit HEVC decoder support, and just dither it post-decode to 8 bits to fit the VDPAU API.

Dithering down to 8 bit is OK for the VDPAU mixer (+conversion for RGB, note that the driver should only dither after colorspace conversions and tone mapping - not before) and display functions, but dithering down to 8 bit (or touching the video surface in any other way) is NOT OK when mapping the decoded planes directly.

For them to be useful and reliable, hardware decoders for formats with a fixed specification (e.g. H.264, HEVC) MUST output the same (identical) result as an equivalent software decoder. Doing anything else has a number of drawbacks. This is especially important for HDR content, for which dithering to 8-bit would basically completely destroy the image.

haasn, you’re completely correct, ideally we want full native 10/12 bit and HDR support. But such support needs to be added in VDPAU first, and that may take years. Dithering down to 8 bit is something in NVidia’s control, can be done immediately, and is very useful as a solution for now. NVidia can update the decoder later when VDPAU extends its API to 10/12 bit and HDR.

I have a 4K display and GTX 960 with HEVC10 decoder. Yet I can’t play HEVC 10 bit content on Linux (with CPU decoding) without stuttering. On Windows the GTX 960 can decode it at CPU idle. It’s worth getting that hardware decode working in Linux now, living with today’s VDPAU limitations, and adding HDR and 30/36 bit native support in future years when it’s possible.

So, five months later, can you tell us where things stand on those issues?
HEVC 10-bit currently works on AMD cards using VDPAU: https://www.reddit.com/r/linux/comments/51csy9/amd_rx_460_for_4k_video/
Now, should I wait forever or switch hardware?

Can you at least give us an update ?

Wow. That’s like a slap in the face :(
AMD (with, in my opinion, horrible Linux support) gets HEVC 10-bit decode with VDPAU, and NVIDIA users still have to wait (maybe months)?

That sounds like a bad joke ;(

I have been waiting for what feels like forever for this. In the end I bought a £35 Android TV box off eBay; it plays Main 10 and even HDR. I would advise everyone to do the same, as NVIDIA is in no rush to get this to us. It sucks that I have a £400 Pascal card and no driver support.

The cuvid/nvdecode library in the latest 375.20 drivers has support for both 12-bit decode and P016 output surfaces. Of course, these won’t be documented until the next SDK release, but I’ve been habitually testing for them to show up, and they finally did.

I’ve got pending changes for ffmpeg and mpv to support it, so there’s a solution here if you’re prepared to abandon vdpau (insert joke that nvidia already did).

Hi, I have noticed others getting hardware acceleration using cuvid/nvdecode but haven’t been able to get it working myself. What kind of player are they using for this, and would it be possible to get it working with Kodi?

I haven’t been able to decode with CUDA through mpv, but QMPlay2 seems to work well for me. I don’t think it is possible with Kodi as of now.

Hi, thanks, I will give that a try :)

Thanks! mpv with hwdec=cuda is working fine!
I’ve built the latest ffmpeg and mpv from git, and cuda from the SDK (7.1.9).

This file runs fine: http://demo-uhd3d.com/fiche.php?cat=uhd&id=132
In software it was a complete drama :)
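For anyone trying to reproduce this, the relevant bit is just the hwdec option. A minimal mpv.conf sketch - assuming a build of mpv/ffmpeg new enough to include the cuda hwdec; double-check the option name against your build’s `mpv --hwdec=help`:

```
# ~/.config/mpv/mpv.conf -- minimal sketch
hwdec=cuda        # decode via NVDEC/CUVID instead of vdpau
```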

Hi Guys,

using Linux and a GTX 950, I had been under the impression that my PC was able to play HEVC (H.265) content flawlessly. Instead, I face heavy stuttering when playing such a video file.

Executing vdpauinfo, I had to learn that only the base profile is supported, not the 10-bit HEVC profile. Because I go with a lightweight dual core, there is not enough CPU power to do the software decoding.

I am wondering: if I were to buy new hardware components, what would be the best choice to get a fully HEVC-compatible HTPC running right now?

Should I upgrade to a GTX 1050/Ti, and will that work with Linux?
Should I switch to AMD and their current RX series? I’ve heard AMD has bad Linux drivers, but I’ve never tested an AMD GPU with Linux personally…
Should I go with Kaby Lake’s iGPU? Intel says it supports HEVC too, but is that so on Linux?

Thanks for your advice ;)

And BTW: if you find a solution to make my GTX 950 decode that HEVC, it’s still welcome!!!

Has this issue of no support for the HEVC Main 10 profile been fixed in the Linux drivers?

Is there a public link to the bugs 1617735 and 1632828 mentioned by Aaron?


Since last month, the VDPAU API now has a representation of 10- and 12-bit surfaces and the various other datatypes required for HEVC Main 10/12.

This was in release 1.4 and has now hit the repositories of several Linux distributions.

When might we expect to see the relevant functionality exposed in the driver? Right now, vdpauinfo shows that only HEVC_MAIN is supported, even on hardware that supports HEVC Main 10/12 via NVDEC.
