Decoding 10/12-bit depth video fails in driver 436.30


I use CUDA and FFmpeg to decode videos. This worked fine on NVIDIA driver 430.36, but fails on 436.30.
When I call "cuvidCreateDecoder(CUvideodecoder *phDecoder, CUVIDDECODECREATEINFO *pdci)", it returns "CUDA_ERROR_NOT_SUPPORTED".
I tried videos with different specs and found that the call fails whenever "pdci.bitDepthMinus8" is greater than 0.

Tested video specs (codec / bit depth / min resolution / max resolution):
MPEG1 (8-bit) 48x16 4080x4080
MPEG2 (8-bit) 48x16 4080x4080
H264 (8-bit) 48x16 4096x4096
HEVC (8-bit) 144x144 8192x8192
HEVC (10-bit) 144x144 8192x8192 -> Error
HEVC (12-bit) 144x144 8192x8192 -> Error
VP8 (8-bit) 48x16 4096x4096
VP9 (8-bit) 128x128 8192x8192
VP9 (10-bit) 128x128 8192x8192 -> Error
VP9 (12-bit) 128x128 8192x8192 -> Error

Are 10-bit and 12-bit depths not supported in this driver version?
I confirmed this problem does not happen with driver 430.36.

Software versions:
FFmpeg 4.1.3
CUDA 10.1

Hardware spec:
CPU : Intel® Core™ i7-7700 CPU @ 3.60GHz
GPU : NVIDIA GeForce RTX 2060