I am using the nvcuvid decoder to play 4K-resolution H.264 video, but I found that if I send a packet larger than 2M bytes (common for 4K H.264 video) to the cuvidParseVideoData() function, the decoder no longer produces any decoded images. It seems the decoder has crashed. Is this a bug in nvcuvid?
This is outside my area of expertise. That said, the documentation I could find on NVCUVID states that it supports H.264 video in HD format (1080p); I see no mention of 4K support:
Please note that NVCUVID is deprecated, per notice at the following website:
NVCUVID is deprecated? That surprises me. So what is the newest video decoder on NVIDIA GPUs?
“https://developer.nvidia.com/nvidia-video-codec-sd” is incorrect; please tell me the right URL, thanks!
Sorry, the last character of the link inadvertently got lopped off. The correct URL is:
“Because of its improved performance and quality, NVIDIA is focusing all future encoding development on NVENC, which first added dedicated encoding hardware to the Kepler family of GPUs. NVENC is replacing the earlier CUDA software-based NVCUVENC driver module. On Quadro and Tesla, 341.05 is the only driver to date from R340 to include NVCUVENC. NVCUVENC will not be available with GeForce after R337.
We recommend that developers transition any applications using NVCUVENC to NVENC SDK for H.264 encoding. This requires a Kepler or Maxwell GPU to use.”
NVCUVENC is deprecated. I’m not sure NVCUVID is deprecated at this time. NVCUVENC is replaced by NVENC.
Thanks for catching that, txbob. Not sure how I got mixed up between NVCUVID and NVCUVENC. Sorry about that, I didn’t mean to spread any misinformation.
Is the problem I encountered caused by NVCUVID not being prepared for 4K video?
I agree with what njuffa said. I’m not an expert on nvcuvid, but I wouldn’t be surprised if it’s not ready to handle 4K video as-is. Did you read p. 5 of the nvcuvid.pdf link that njuffa provided?
Hi there, just to add some information: the cudaDecodeGL sample that comes with CUDA (at least 6.5) can indeed handle 4K video. You just need to increase the GPU memory limit in VideoDecoder::VideoDecoder; there’s a comment that says <<// Limit decode memory to 24MB (16M pixels at 4:2:0 = 24M bytes)>> — just increase that memory limit.
Got new info.