Hardware acceleration using DXVA2 on Quadro cards

Hello,
I don’t know if this is the right place to ask, but I will try…
I am using DXVA2 on Windows 10 to hardware-accelerate decoding of HEVC/H.265 video on the graphics card.

In my flow, cards that do not support the codec in hardware ([url]https://developer.nvidia.com/video-encode-decode-gpu-support-matrix[/url]) should not use hardware acceleration, BUT for some unsupported cards the “card” is cheating and reports that it supports the codec.

For example, the Quadro M1000M does not support HEVC according to the support matrix, yet DXVA reports that hardware decoding is supported.
Testing this with “DXVA Checker” gives the same result.

I found that there is a “PureVideo” feature that does hybrid decoding: [url]https://devtalk.nvidia.com/default/topic/1031720/video-codec-and-optical-flow-sdk/video-sdk-hybrid-decode-hevc/post/5252583/#5252583[/url].

So my question is: is there any way to disable this and let my software fall back to CPU-only decoding?

For Maxwell-generation GPUs, the driver falls back to a hybrid decoder implemented using the CPU and CUDA when DXVA decoding is used. There is currently no way to explicitly disable this.

May I know the reason why you would like to disable it?

Thanks, Patait, for the answer.
That answer is exactly what worried me :). In my software I can choose between CPU decoding and hardware decoding via DXVA2. My assumption was that if the GPU does not support the codec, DXVA2 will not return a valid decoder device GUID, and then I will fall back to CPU decoding ([url]https://docs.microsoft.com/en-us/windows/win32/api/dxva2api/nf-dxva2api-idirectxvideodecoderservice-getdecoderdeviceguids[/url]).
(I can say that this works for some other cards, the K600 for example, for a few HEVC profiles.)
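
For reference, this is roughly the check I rely on (a minimal sketch; pDecoderService is assumed to be an IDirectXVideoDecoderService already obtained via DXVA2CreateVideoService, and the GUID value is the one documented for the HEVC Main VLD profile, DXVA2_ModeHEVC_VLD_Main in recent Windows SDKs):

[code]
#include <windows.h>
#include <initguid.h>
#include <dxva2api.h>

// HEVC Main VLD decoder profile GUID as documented for DXVA
// (assumed value; matches DXVA2_ModeHEVC_VLD_Main in recent SDKs).
DEFINE_GUID(kHevcVldMain,
    0x5b11d51b, 0x2f4c, 0x4452, 0xbc, 0xc3, 0x09, 0xf2, 0xa1, 0x16, 0x0c, 0xc0);

bool SupportsHevcMainVld(IDirectXVideoDecoderService *pDecoderService)
{
    UINT count = 0;
    GUID *pGuids = nullptr;

    // Ask the decoder service for all decoder device GUIDs it exposes.
    if (FAILED(pDecoderService->GetDecoderDeviceGuids(&count, &pGuids)))
        return false;

    bool found = false;
    for (UINT i = 0; i < count; ++i) {
        if (IsEqualGUID(pGuids[i], kHevcVldMain)) {
            found = true;
            break;
        }
    }

    // The array is allocated by the API and must be freed by the caller.
    CoTaskMemFree(pGuids);
    return found;
}
[/code]

On the M1000M this returns true even though the decode is only hybrid, so the check alone is not enough to decide between GPU and CPU decoding.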

Performance is degraded when this hybrid decoding takes place, so it is better for me to use CPU decoding instead, but I cannot get any indication that it is happening, as described above.

I would be happy with some sort of solution.

Asaf

Can you check the GPU model and avoid using the hybrid decoder? Fully accelerated HEVC hardware decoding is supported only starting with GM20x GPUs. You can check the device IDs. I understand this is not an ideal solution, but it is the only one I can suggest at the moment.
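
A minimal sketch of what such a check could look like through IDirect3D9::GetAdapterIdentifier (the model list below is only illustrative, not a complete list of affected parts):

[code]
#include <d3d9.h>
#include <cstring>

// Returns true if the adapter is an NVIDIA GPU known to use the hybrid
// HEVC decoder. pD3D is assumed to be an existing IDirect3D9 instance.
bool IsHybridHevcCard(IDirect3D9 *pD3D, UINT adapter)
{
    D3DADAPTER_IDENTIFIER9 id = {};
    if (FAILED(pD3D->GetAdapterIdentifier(adapter, 0, &id)))
        return false;

    if (id.VendorId != 0x10DE)   // only NVIDIA adapters are of interest
        return false;

    // Hypothetical blacklist of pre-GM20x boards; extend as needed.
    static const char *kHybridModels[] = {
        "Quadro M1000M",         // example from this thread
        "GeForce GTX 750",       // illustrative first-generation Maxwell entry
    };
    for (const char *model : kHybridModels) {
        if (strstr(id.Description, model) != nullptr)
            return true;
    }
    return false;
}
[/code]

Checking PCI device IDs instead of the description string works the same way; the DeviceId field of D3DADAPTER_IDENTIFIER9 carries the value to compare against your own list.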

Thank you.
I will take your advice and try to avoid those cards using a blacklist.

Is there any chance I can talk with the NVIDIA driver people to see if there is a hidden configuration that can disable this (Feature Set E)?