CentOS 7.9
CUDA 11.3 (cuda-repo-rhel7-11-3-local-11.3.0_465.19.01-1.x86_64.rpm)
NVIDIA GTX 460 (GF104) card
NVIDIA Legacy Driver 390.143 (NVIDIA-Linux-x86_64-390.143.run)
Video_Codec_SDK_11.0.10
Following the sample code in NvDecoder.cpp, when I call cuvidGetDecoderCaps() for each codec configuration, I get the following output:
CUDA_VIDEO_DECODER constructor.
CUDA_VIDEO_DECODER found 1 NVIDIA devices:
CUDA_VIDEO_DECODER NVIDIA_DEVICE[00] <GeForce GTX 460>
CUDA_VIDEO_DECODER using NVIDIA_DEVICE[00] <GeForce GTX 460>
CUDA_VIDEO_DECODER device capabilities as follows:
Codec <JPEG>, BitDepth <8>, ChromaFormat <4:2:0>, Supported <1>, Max <32768x16384>, MaxMBCount <67108864>, Min <64x64>, SurfaceFormat <0x00000000>-<N/A>
Codec <MPEG1>, BitDepth <8>, ChromaFormat <4:2:0>, Supported <1>, Max <2032x2032>, MaxMBCount <8192>, Min <48x16>, SurfaceFormat <0x00000000>-<N/A>
Codec <MPEG2>, BitDepth <8>, ChromaFormat <4:2:0>, Supported <1>, Max <2032x2032>, MaxMBCount <8192>, Min <48x16>, SurfaceFormat <0x00000000>-<N/A>
Codec <MPEG4>, BitDepth <8>, ChromaFormat <4:2:0>, Supported <1>, Max <2032x2032>, MaxMBCount <8192>, Min <48x16>, SurfaceFormat <0x00000000>-<N/A>
Codec <H264>, BitDepth <8>, ChromaFormat <4:2:0>, Supported <1>, Max <2032x2032>, MaxMBCount <8192>, Min <48x16>, SurfaceFormat <0x00000000>-<N/A>
Codec <HEVC>, BitDepth <8>, ChromaFormat <4:2:0>, Supported <0>, Max <0x0>, MaxMBCount <0>, Min <0x0>, SurfaceFormat <0x00000000>-<N/A>
Codec <HEVC>, BitDepth <10>, ChromaFormat <4:2:0>, Supported <0>, Max <0x0>, MaxMBCount <0>, Min <0x0>, SurfaceFormat <0x00000000>-<N/A>
Codec <HEVC>, BitDepth <12>, ChromaFormat <4:2:0>, Supported <0>, Max <0x0>, MaxMBCount <0>, Min <0x0>, SurfaceFormat <0x00000000>-<N/A>
Codec <HEVC>, BitDepth <8>, ChromaFormat <4:4:4>, Supported <0>, Max <0x0>, MaxMBCount <0>, Min <0x0>, SurfaceFormat <0x00000000>-<N/A>
Codec <HEVC>, BitDepth <10>, ChromaFormat <4:4:4>, Supported <0>, Max <0x0>, MaxMBCount <0>, Min <0x0>, SurfaceFormat <0x00000000>-<N/A>
Codec <HEVC>, BitDepth <12>, ChromaFormat <4:4:4>, Supported <0>, Max <0x0>, MaxMBCount <0>, Min <0x0>, SurfaceFormat <0x00000000>-<N/A>
Codec <VC1>, BitDepth <8>, ChromaFormat <4:2:0>, Supported <1>, Max <2032x2032>, MaxMBCount <8192>, Min <48x16>, SurfaceFormat <0x00000000>-<N/A>
Codec <VP8>, BitDepth <8>, ChromaFormat <4:2:0>, Supported <0>, Max <0x0>, MaxMBCount <0>, Min <0x0>, SurfaceFormat <0x00000000>-<N/A>
Codec <VP9>, BitDepth <8>, ChromaFormat <4:2:0>, Supported <0>, Max <0x0>, MaxMBCount <0>, Min <0x0>, SurfaceFormat <0x00000000>-<N/A>
Codec <VP9>, BitDepth <10>, ChromaFormat <4:2:0>, Supported <0>, Max <0x0>, MaxMBCount <0>, Min <0x0>, SurfaceFormat <0x00000000>-<N/A>
Codec <VP9>, BitDepth <12>, ChromaFormat <4:2:0>, Supported <0>, Max <0x0>, MaxMBCount <0>, Min <0x0>, SurfaceFormat <0x00000000>-<N/A>
Codec <AV1>, BitDepth <8>, ChromaFormat <4:2:0>, Supported <0>, Max <0x0>, MaxMBCount <0>, Min <0x0>, SurfaceFormat <0x00000000>-<N/A>
Codec <AV1>, BitDepth <10>, ChromaFormat <4:2:0>, Supported <0>, Max <0x0>, MaxMBCount <0>, Min <0x0>, SurfaceFormat <0x00000000>-<N/A>
Codec <AV1>, BitDepth <8>, ChromaFormat <4:0:0>, Supported <0>, Max <0x0>, MaxMBCount <0>, Min <0x0>, SurfaceFormat <0x00000000>-<N/A>
Codec <AV1>, BitDepth <10>, ChromaFormat <4:0:0>, Supported <0>, Max <0x0>, MaxMBCount <0>, Min <0x0>, SurfaceFormat <0x00000000>-<N/A>
CUDA_VIDEO_DECODER constructor complete.
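For reference, the probe behind this table boils down to the following (a minimal sketch against the SDK headers; the logging format is my own, only one codec configuration is shown, and error checking is omitted; link with -lcuda -lnvcuvid):

#include <cstdio>
#include <cuda.h>
#include <nvcuvid.h>

int main()
{
    // cuvidGetDecoderCaps() needs a current CUDA context on the target device.
    cuInit(0);
    CUdevice dev;
    cuDeviceGet(&dev, 0);
    CUcontext ctx;
    cuCtxCreate(&ctx, 0, dev);

    CUVIDDECODECAPS caps = {};
    caps.eCodecType      = cudaVideoCodec_H264;        // input: codec to query
    caps.eChromaFormat   = cudaVideoChromaFormat_420;  // input: chroma format
    caps.nBitDepthMinus8 = 0;                          // input: 8-bit

    if (cuvidGetDecoderCaps(&caps) == CUDA_SUCCESS) {
        printf("Supported <%d>, Max <%ux%u>, SurfaceFormat <0x%08x>\n",
               caps.bIsSupported, caps.nMaxWidth, caps.nMaxHeight,
               (unsigned)caps.nOutputFormatMask);
    }
    cuCtxDestroy(ctx);
    return 0;
}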
As the SurfaceFormat column shows, decodecaps.nOutputFormatMask comes back 0 for every codec, even those with Supported <1>.
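As I understand it, the mask carries one bit per cudaVideoSurfaceFormat value, so even a plain NV12 test fails (a sketch of the check; enum values from cuviddec.h):

// bit 0 = NV12, bit 1 = P016, bit 2 = YUV444, bit 3 = YUV444_16Bit
bool nv12 = caps.nOutputFormatMask & (1 << cudaVideoSurfaceFormat_NV12); // false here, mask is 0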
This causes the sample programs to fail with:
terminate called after throwing an instance of 'NVDECException'
what(): HandleVideoSequence : No supported output format found at …/…/…/src/common/video/nvidia/Video_Codec_SDK_11.0.10/Samples/NvCodec/NvDecoder/NvDecoder.cpp:270
The relevant code is in NvDecoder::HandleVideoSequence(CUVIDEOFORMAT *pVideoFormat):
// Check if output format supported. If not, check fallback options
if (!(decodecaps.nOutputFormatMask & (1 << m_eOutputFormat)))
{
    if (decodecaps.nOutputFormatMask & (1 << cudaVideoSurfaceFormat_NV12))
        m_eOutputFormat = cudaVideoSurfaceFormat_NV12;
    else if (decodecaps.nOutputFormatMask & (1 << cudaVideoSurfaceFormat_P016))
        m_eOutputFormat = cudaVideoSurfaceFormat_P016;
    else if (decodecaps.nOutputFormatMask & (1 << cudaVideoSurfaceFormat_YUV444))
        m_eOutputFormat = cudaVideoSurfaceFormat_YUV444;
    else if (decodecaps.nOutputFormatMask & (1 << cudaVideoSurfaceFormat_YUV444_16Bit))
        m_eOutputFormat = cudaVideoSurfaceFormat_YUV444_16Bit;
    else
        NVDEC_THROW_ERROR("No supported output format found", CUDA_ERROR_NOT_SUPPORTED);
}
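To spell out the failure: with nOutputFormatMask == 0, every test above evaluates to (0 & (1 << fmt)) == 0, so all four fallbacks are skipped and NVDEC_THROW_ERROR fires. A standalone check of that logic (my own snippet, not SDK code; surface-format bit positions assumed from cuviddec.h):

#include <cassert>

// Bit positions per cudaVideoSurfaceFormat:
// NV12 = 0, P016 = 1, YUV444 = 2, YUV444_16Bit = 3.
static bool anyOutputFormatSupported(unsigned int mask)
{
    for (int fmt = 0; fmt <= 3; ++fmt)
        if (mask & (1u << fmt))
            return true;
    return false;
}

int main()
{
    assert(!anyOutputFormatSupported(0x00000000)); // what my GTX 460 reports: throws
    assert(anyOutputFormatSupported(0x00000001));  // NV12-only mask: would pass
    return 0;
}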
Why is nOutputFormatMask always 0 on this setup?
Any clues?