Video Codec - Buffer Formats supported w/ D3D11?

I’ve been using the NvEncoderD3D11 class provided in the Video Codec SDK to encode a passed-in D3D11 texture to HEVC for some time, but now want to bump up the bit-depth to 10. Looking at the NvEncoderD3D11 class, only the following formats appear supported:

  • NV_ENC_BUFFER_FORMAT_ARGB (passing in DXGI_FORMAT_B8G8R8A8_UNORM)
  • NV_ENC_BUFFER_FORMAT_NV12 (passing in DXGI_FORMAT_NV12)

What I had thought would also work, based on the existing convention, was NV_ENC_BUFFER_FORMAT_ABGR10 (passing in DXGI_FORMAT_R10G10B10A2_UNORM), but I now see that it’s not included in the provided code. Are the unimplemented buffer formats in the NvEncoder subclasses meant to indicate a lack of support at the SDK level? If not, where should one look to understand how to implement encoding of the formats that aren’t provided?
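For concreteness, the format mapping inside NvEncoderD3D11.cpp boils down to a small switch, and the ABGR10 case below is the extension I had assumed would work by the same convention (added by me, untested):

```cpp
// Sketch of the NvEncoderD3D11 buffer-format -> DXGI mapping; anything not
// listed falls through to DXGI_FORMAT_UNKNOWN and fails registration.
static DXGI_FORMAT GetD3D11Format(NV_ENC_BUFFER_FORMAT eBufferFormat)
{
    switch (eBufferFormat)
    {
    case NV_ENC_BUFFER_FORMAT_NV12:
        return DXGI_FORMAT_NV12;
    case NV_ENC_BUFFER_FORMAT_ARGB:
        return DXGI_FORMAT_B8G8R8A8_UNORM;
    case NV_ENC_BUFFER_FORMAT_ABGR10:          // my addition -- untested
        return DXGI_FORMAT_R10G10B10A2_UNORM;
    default:
        return DXGI_FORMAT_UNKNOWN;
    }
}
```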

Thanks

Through further investigation:

  • I now see that NV_ENC_BUFFER_FORMAT_ARGB still results in 4:2:0 subsampling, which was unwanted; I was expecting 4:4:4, though I can see why it’s not the default. I’ve now switched to handling the RGB->YUV conversion myself so that I can pass in the buffer format I actually want, NV_ENC_BUFFER_FORMAT_YUV444 (and its _10BIT variant).
  • The headers say that YUV444 is a planar format. There’s no DXGI_FORMAT for planar YUV, only packed YUV, so the Direct3D11 texture can only be an R16, with the width or height 3x the image dimension to hold the three planes. If I do this, though, I get a memory-access exception in NvEncRegisterResource (roughly the sequence sketched below).
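For reference, this is roughly what triggers the exception. Here device, nvenc (the NV_ENCODE_API_FUNCTION_LIST), and hEncoder are my own objects, and stacking the planes vertically is my guess at the expected layout:

```cpp
// Planar YUV444 10-bit attempt: three full-size 16-bit planes stacked in one
// R16 texture (unclear whether the stacking should be in height or width).
D3D11_TEXTURE2D_DESC desc = {};
desc.Width            = width;
desc.Height           = height * 3;            // Y, U, V planes stacked
desc.Format           = DXGI_FORMAT_R16_UNORM; // 10-bit samples in 16-bit words
desc.MipLevels        = 1;
desc.ArraySize        = 1;
desc.SampleDesc.Count = 1;
desc.Usage            = D3D11_USAGE_DEFAULT;
desc.BindFlags        = D3D11_BIND_RENDER_TARGET;
ID3D11Texture2D* pTex = nullptr;
device->CreateTexture2D(&desc, nullptr, &pTex);

NV_ENC_REGISTER_RESOURCE reg = { NV_ENC_REGISTER_RESOURCE_VER };
reg.resourceType       = NV_ENC_INPUT_RESOURCE_TYPE_DIRECTX;
reg.resourceToRegister = pTex;
reg.width              = width;
reg.height             = height;
reg.pitch              = 0;
reg.bufferFormat       = NV_ENC_BUFFER_FORMAT_YUV444_10BIT;
// This is the call that raises the memory-access exception:
NVENCSTATUS status = nvenc.nvEncRegisterResource(hEncoder, &reg);
```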

So I’m still unclear: can NVENC produce 4:4:4 video (10-bit or otherwise) through the DirectX path?

Hi @BlueprintBen ,

I’ve managed to produce 8-bit-per-channel 4:4:4 video with DirectX and NVENC by using the AYUV format (it is a packed format), but there’s a caveat: https://forums.developer.nvidia.com/t/unexpected-color-space-conversion-with-h-264-ayuv-format-444-yuv/197901
It looks like 10-bit 4:4:4 is not possible with DirectX.
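For what it’s worth, the texture is created as DXGI_FORMAT_AYUV and registered as NV_ENC_BUFFER_FORMAT_AYUV, and I pack each pixel along these lines (PackAYUV is just a hypothetical helper of mine):

```cpp
#include <cstdint>

// One 32-bit AYUV word per pixel: V in the lowest byte, then U, then Y,
// with A in the highest byte (per the DXGI_FORMAT_AYUV layout).
static inline uint32_t PackAYUV(uint8_t y, uint8_t u, uint8_t v, uint8_t a = 0xFF)
{
    return (uint32_t(a) << 24) | (uint32_t(y) << 16) |
           (uint32_t(u) <<  8) |  uint32_t(v);
}
```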

Thanks for the information @patsku79. That’s something at least. So has that conversion issue been resolved in more recent drivers?

Looking at the D3D12 thread, it seems like YUV420_10BIT and ABGR10 are accepted (at least in D3D12), but in my testing ABGR10 is output as YUV420, which isn’t desired either. Have you tested whether YUV420_10BIT produces 10-bit output?

Since it seems that the highest quality can’t be achieved through the DirectX path, I guess I’m interested in examples or tips on using the CUDA path with D3D11 interop to achieve YUV444_10BIT encoding from a D3D11 input texture.

The driver issue seems to persist, so I’m still using the older drivers. In my case I needed to perform the color conversion myself (or, in fact, avoid it altogether), so I needed a way to submit YUV-format textures to the encoder. But I’m surprised to learn that submitting RGB (or BGRA) textures does not produce 4:4:4 video when NV_ENC_BUFFER_FORMAT_YUV444 is specified. Did you make sure to use the correct profile (NV_ENC_H264_PROFILE_HIGH_444_GUID if using H.264) and set chromaFormatIDC=3? My setup looks roughly like the sketch below.
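This uses the SDK sample’s NvEncoder helper; pEnc, the preset, and the tuning info are just what I happen to use:

```cpp
NV_ENC_INITIALIZE_PARAMS initParams   = { NV_ENC_INITIALIZE_PARAMS_VER };
NV_ENC_CONFIG            encodeConfig = { NV_ENC_CONFIG_VER };
initParams.encodeConfig = &encodeConfig;
pEnc->CreateDefaultEncoderParams(&initParams, NV_ENC_CODEC_H264_GUID,
                                 NV_ENC_PRESET_P4_GUID,
                                 NV_ENC_TUNING_INFO_HIGH_QUALITY);

// Without both of these the encoder falls back to 4:2:0:
encodeConfig.profileGUID = NV_ENC_H264_PROFILE_HIGH_444_GUID;
encodeConfig.encodeCodecConfig.h264Config.chromaFormatIDC = 3;

pEnc->CreateEncoder(&initParams);
```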

Regarding the 4:2:0 outcome, I was referring to the use of NV_ENC_BUFFER_FORMAT_ARGB rather than AYUV, and that it produced a 4:2:0 output (I didn’t try setting PROFILE_HIGH_444 or IDC=3, though).

I did try NV_ENC_BUFFER_FORMAT_AYUV and see the same issue of the data still being treated as RGB, though I can at least pass in RGB for the time being. Otherwise it’s working as you described, so I’m happy to have achieved 4:4:4 output. Thanks for your assistance.