I have problems decoding 4K video (8-bit 3840x2160) on a GTX 960 card. The decoder seems to decode only the top 503 lines. When I look at the decoded NV12 video data, the first 503 rows have Y values > 0, while starting with row 504 all the Y values are 0.
When I look at the CbCr data, the first 247 rows have values close to 0x80 (the first frame is mostly black). But starting with row 248, they are all 0.
The rows are aligned on a 4096-byte pitch, so the data looks like this:
ptr + 4096 * 0 … ptr + 4096 * 503 are all good (0x0F)
ptr + 4096 * 504 … ptr + 4096 * 2159 are all 0s
ptr + 4096 * 2160 … ptr + 4096 * 2407 are all good (0x80)
ptr + 4096 * 2408 … ptr + 4096 * 3239 are all 0s
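For context, this is roughly how I scan the mapped frame to get the numbers above (a simplified sketch, not my exact code; it assumes the decoded NV12 frame has already been copied to host memory with the 4096-byte pitch):

```cpp
#include <cstdint>
#include <cstdio>

// ptr points to a host copy of the decoded NV12 frame,
// pitch is the surface pitch in bytes (4096 here), width/height are 3840/2160.
void scan_nv12(const uint8_t* ptr, size_t pitch, int width, int height)
{
    // Y plane: rows 0 .. height-1
    for (int row = 0; row < height; ++row) {
        const uint8_t* line = ptr + pitch * row;
        bool all_zero = true;
        for (int x = 0; x < width; ++x)
            if (line[x] != 0) { all_zero = false; break; }
        if (all_zero) { std::printf("first all-zero Y row: %d\n", row); break; }
    }

    // Interleaved CbCr plane: rows height .. height + height/2 - 1
    for (int row = 0; row < height / 2; ++row) {
        const uint8_t* line = ptr + pitch * (height + row);
        bool all_zero = true;
        for (int x = 0; x < width; ++x)   // width bytes = width/2 Cb/Cr pairs
            if (line[x] != 0) { all_zero = false; break; }
        if (all_zero) { std::printf("first all-zero CbCr row: %d\n", row); break; }
    }
}
```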
I believe this card has the GM206 chipset (although I don’t know how to determine this, since the NVIDIA specifications page doesn’t mention anything about the chipset). If it is indeed GM206, then according to https://developer.nvidia.com/nvidia-video-codec-sdk#NVDECFeatures, this card should be able to handle video up to 10-bit 4096x2304, so 3840x2160 8-bit should work.
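In case it matters, this is how I understand the decode capabilities can be queried programmatically (a minimal sketch, assuming the streams are HEVC and a Video Codec SDK version that provides cuvidGetDecoderCaps; error checking omitted):

```cpp
#include <cuda.h>
#include <nvcuvid.h>
#include <cstdio>

int main()
{
    cuInit(0);
    CUdevice dev;   cuDeviceGet(&dev, 0);
    CUcontext ctx;  cuCtxCreate(&ctx, 0, dev);      // caps query needs a current CUDA context

    CUVIDDECODECAPS caps = {};
    caps.eCodecType      = cudaVideoCodec_HEVC;      // assumption: the streams are HEVC
    caps.eChromaFormat   = cudaVideoChromaFormat_420;
    caps.nBitDepthMinus8 = 0;                        // 8-bit; set to 2 to test 10-bit support
    cuvidGetDecoderCaps(&caps);

    std::printf("supported=%d maxWidth=%u maxHeight=%u maxMBCount=%u\n",
                caps.bIsSupported, caps.nMaxWidth, caps.nMaxHeight, caps.nMaxMBCount);

    cuCtxDestroy(ctx);
    return 0;
}
```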
A second problem is that this card fails to decode 1080p 10-bit video. It seems to treat it as 8-bit, and the decoded data is all garbage.
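For reference, the decoder creation in my sequence callback looks roughly like this (a simplified sketch following the nvcuvid.h field names, so P016 output for 10-bit content; I may well be doing something wrong here):

```cpp
// Inside the sequence callback (CUVIDEOFORMAT *fmt describes the parsed stream):
CUVIDDECODECREATEINFO ci = {};
ci.CodecType           = fmt->codec;
ci.ChromaFormat        = fmt->chroma_format;
ci.bitDepthMinus8      = fmt->bit_depth_luma_minus8;          // should be 2 for 10-bit
ci.OutputFormat        = fmt->bit_depth_luma_minus8
                             ? cudaVideoSurfaceFormat_P016     // 16-bit container for 10-bit
                             : cudaVideoSurfaceFormat_NV12;
ci.ulWidth             = fmt->coded_width;
ci.ulHeight            = fmt->coded_height;
ci.ulMaxWidth          = fmt->coded_width;
ci.ulMaxHeight         = fmt->coded_height;
ci.ulTargetWidth       = fmt->coded_width;
ci.ulTargetHeight      = fmt->coded_height;
ci.ulNumDecodeSurfaces = fmt->min_num_decode_surfaces;
ci.ulNumOutputSurfaces = 2;
ci.ulCreationFlags     = cudaVideoCreate_PreferCUVID;
ci.DeinterlaceMode     = cudaVideoDeinterlaceMode_Weave;

CUvideodecoder decoder = nullptr;
cuvidCreateDecoder(&decoder, &ci);
```

If fmt->bit_depth_luma_minus8 comes back as 0 here for the 10-bit stream, that would match the "only 8 bit" behavior I'm seeing.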