Is there Nvidia Encoder/Decoder which supports HDR (10 bpp) for AVC/H.264?

Hi Nvidia community

I looked at the support matrix on the NVIDIA VIDEO CODEC SDK | NVIDIA Developer page and found that NVIDIA hardware supports HDR (10 bpp) for HEVC encoding only. For AVC/H.264 encoding, only 8 bpp is supported.

Is there Nvidia Encoder/Decoder which supports HDR (10 bpp) for AVC/H.264?

Hi there!

Unfortunately, I don’t think anyone ever implemented hardware encoding/decoding for H.264 Hi10P (or the other profiles that support higher bit depths).

The anime fansub community adopted H.264 Hi10P because it lessened color banding at reasonable bitrates. Unfortunately, there weren’t common applications for 10-bit at the time, so the industry didn’t bother with hardware support for it. There are specialized professional solutions, but nothing that made it into consumer chips.

HEVC (which had 10-bit support from the outset) was ready by the time HDR-capable displays became widespread, so chip designers and content producers focused on that instead.

H.264’s 10-bit profiles are widely considered a dead end, although there are interesting rumors about Turing’s NVENC/NVDEC…we’ll have to wait and see.

Hi sriabtsev,

KiriNotes is correct: NVIDIA’s hardware and software stack does not support 10 bpp H.264. For HDR you need to use HEVC, which supports 10 bpp in both our encoder and decoder.
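As an illustration (not part of the original reply), an HDR10 encode through NVENC’s HEVC path can be sketched with FFmpeg, assuming a build that includes the hevc_nvenc encoder and a 10-bit source; the input and output filenames here are placeholders:

```shell
# Sketch: HDR10 encode via NVENC HEVC (requires an FFmpeg build with hevc_nvenc).
# input.mov is a hypothetical 10-bit HDR master; the color flags tag
# BT.2020 primaries and the PQ (SMPTE ST 2084) transfer function.
ffmpeg -i input.mov \
  -c:v hevc_nvenc -profile:v main10 -pix_fmt p010le \
  -color_primaries bt2020 -color_trc smpte2084 -colorspace bt2020nc \
  -c:a copy output_hdr10.mp4
```

The Main 10 profile and the p010le pixel format are what carry the 10 bpp requirement; the color flags only tag the stream’s metadata.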

Thanks,
Ryan Park

There is Action!, a recording and streaming software package that supports NVIDIA’s HDR10 (HEVC) encoding.

10-bit has nothing to do with HDR per se. HDR is about the PQ transfer function. Sigh.