AppDec Sample not working @ GeForce 1080 Ti

Good morning everyone,

So I’m trying the new 8.1 version of this Video SDK from Nvidia. I managed to compile it on a Linux distribution with CUDA 9 and driver version 390.48. The machine is running a 1080 Ti GPU and I’m doing some tests for research I’m conducting. I managed to encode a couple of videos at two different resolutions, 4096x2048 and 2048x1024, with different presets, using HEVC as the codec. Encoding works perfectly fine, but decoding is giving me a pile of problems:

1.- Decoding a video with 4:4:4 chroma subsampling gives me the following warning + error:
Warning: [hevc @ 0x3940260] Stream #0: not enough frames to estimate rate; consider increasing probesize
Error: HandleVideoSequence : Codec not supported on this GPU at …/…/NvCodec/NvDecoder/NvDecoder.cpp:146

2.- Decoding a 4096x2048 video with 4:2:0 chroma subsampling: it recovers half of the information, then throws an error along with the same warning as before:
Warning: [hevc @ 0x3462440] Stream #0: not enough frames to estimate rate; consider increasing probesize
Error: [NULL @ 0x3463620] missing picture in access unit

3.- When I decode a 2048x1024 video with 4:2:0 chroma subsampling and try to play it back, it is as if it lost some color in the process: every frame comes out greyish instead of fully colored (it still gives the same “[hevc @ 0x3940260] Stream #0: not enough frames to estimate rate; consider increasing probesize” warning). I looked at the compressed files with VLC player and they look good, they play with the expected colors, so the issue must be in the decoder.

Moreover, I looked into the code where the errors happen when decoding 4:4:4 videos, and it is related to retrieving the information about the GPU’s compatibility with the codec. I tried to hardcode some values to get past the failing checks, but it still breaks in the end, and since decoding the 2048x1024 videos with 4:2:0 sampling already gives wrong results, I stopped digging further.

Any suggestions, guys? Maybe I’m doing something wrong?

Thanks a lot for all your help!

Raw streams do not contain a “frame rate” parameter (it is set in the container, like mkv/avi/mp4…). The decoder does not depend on the “frame rate”. The presenter/player of the decoded images usually tries to derive the “frame rate” by analyzing the PTS timestamps in the stream; if the PTS are not present or have been cleared, it is impossible to derive the “frame rate”, and the player can fall back to a default “frame rate” or one set externally (for example from the command line).

Try checking the capabilities with cuvidGetDecoderCaps() on your HW. But I suspect the HW NVDEC decoder can handle 4:2:0 streams only (i.e. Main/Main10/Main12 profiles up to Level 5.1 and NOT other profiles - see https://en.wikipedia.org/wiki/High_Efficiency_Video_Coding#Profiles).
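In case it helps, here is a minimal sketch of such a capability query, assuming the SDK 8.1 headers (nvcuvid.h) and linking against cuda and nvcuvid:

[code]
#include <cstdio>
#include <cuda.h>
#include <nvcuvid.h>

int main()
{
    cuInit(0);
    CUdevice dev;
    cuDeviceGet(&dev, 0);
    CUcontext ctx;
    cuCtxCreate(&ctx, 0, dev);   // cuvidGetDecoderCaps needs a current context

    // Ask the driver whether HEVC 4:4:4 8-bit decode is supported on this GPU.
    CUVIDDECODECAPS caps = {};
    caps.eCodecType      = cudaVideoCodec_HEVC;
    caps.eChromaFormat   = cudaVideoChromaFormat_444;   // try _420 as well
    caps.nBitDepthMinus8 = 0;                           // 8-bit

    if (cuvidGetDecoderCaps(&caps) == CUDA_SUCCESS) {
        printf("supported: %d\n", caps.bIsSupported);
        printf("max size:  %ux%u, max MB count: %u\n",
               caps.nMaxWidth, caps.nMaxHeight, caps.nMaxMBCount);
    }

    cuCtxDestroy(ctx);
    return 0;
}
[/code]

If bIsSupported comes back 0 for cudaVideoChromaFormat_444, that would match the HandleVideoSequence error you are seeing.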

Hi mcerveny! First of all, thanks for answering, let me answer your notes:

1.- We agree that the decoder does not depend on frame rates, since a raw video just consists of consecutive pixels one after another. I guess that warning shouldn’t affect the decoding process.

2.- Regarding compatibility, when you fire up the HW NVDEC decoder sample from Nvidia with the help flag (-h), it shows you the current support for the GPU in use. For me, these are the capabilities:

[url]https://imgur.com/a/vVhCNsb[/url]

Apparently it is capable of decoding any video no larger than 8192x8192 with less than 255 GB of total file size. My largest video is 70 GB, at 4096x2048 per frame. I am not that concerned about decoding at 4:4:4; maybe, as you suggest, the application is not ready to work with 4:4:4 chroma. Still, my main issue is decoding the 2048x1024 video, which has a total file size of 17 GB in 4:2:0. While it doesn’t give any error, the output is not as expected (every frame comes out greyish). This is what is stopping me from taking benchmarks and timings.

MB is not a measure of megabytes but of encoding macroblocks (see https://en.wikipedia.org/wiki/Macroblock). The decoder can decode a video stream of arbitrary length.

You should double-check the output buffer offset of the UV color information (chrominance). Maybe you are displaying only the Y part (luminance) (see https://en.wikipedia.org/wiki/YUV). Or you are facing the “washed” colors problem (i.e. the limited/full color range problem, https://kodi.wiki/view/Video_levels_and_color_space - search the Internet for hints).
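To illustrate the range issue: limited-range (“video levels”) YUV stores luma in [16,235] and chroma in [16,240], so converting it with full-range math produces exactly that washed-out, low-contrast look. A rough per-pixel sketch of the BT.601 limited-range conversion (my own helper, not something from the SDK):

[code]
#include <algorithm>
#include <cstdint>

// Convert one BT.601 limited-range ("video levels") YUV pixel to 8-bit RGB.
// Treating limited-range data as full range skips the 1.164 rescaling below,
// which is what gives washed-out colors.
static void yuv_to_rgb_bt601_limited(uint8_t y, uint8_t u, uint8_t v,
                                     uint8_t& r, uint8_t& g, uint8_t& b)
{
    const float fy = 1.164f * (y - 16);   // rescale luma [16,235] -> [0,255]
    const float fu = u - 128.0f;
    const float fv = v - 128.0f;

    const auto clamp8 = [](float x) {
        return static_cast<uint8_t>(std::min(std::max(x, 0.0f), 255.0f));
    };
    r = clamp8(fy + 1.596f * fv);
    g = clamp8(fy - 0.392f * fu - 0.813f * fv);
    b = clamp8(fy + 2.017f * fu);
}
[/code]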

Good to know! I was skeptical about the decoder having a limit related to file size; now it makes sense. Thanks!!

Ok, so I made a discovery here. In the decoder, the application lets you flag the decoding process with the “-outplanar” flag. Setting it, the decoder says the output will be planar (which it already appears to be without the flag). When I tested it, although both the previous greyish video and the new one look planar, the one decoded with the -outplanar flag DOES have the expected color palette. Reason? No idea at all; it seems like a decoding issue in the Nvidia decoder. Both outputs look planar, with or without -outplanar, but setting it outputs the correct colors for the video. I think this could be a bug?

Regardless, I noticed another thing: if I use the “lossless” preset in the encoding process, the output is not lossless but lossy (the PSNR is not infinite).
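For reference, this is the check I have in mind for losslessness, byte-comparing the pre-encode raw YUV against the decoded output (my own sketch; a truly lossless round trip must reproduce the input exactly, i.e. MSE = 0 and PSNR = infinity):

[code]
#include <fstream>
#include <iostream>
#include <iterator>
#include <vector>

// Compare the original raw YUV with the encode->decode output byte by byte.
int main(int argc, char** argv)
{
    if (argc != 3) {
        std::cerr << "usage: " << argv[0] << " original.yuv decoded.yuv\n";
        return 1;
    }
    std::ifstream fa(argv[1], std::ios::binary), fb(argv[2], std::ios::binary);
    std::vector<char> a{std::istreambuf_iterator<char>(fa),
                        std::istreambuf_iterator<char>()};
    std::vector<char> b{std::istreambuf_iterator<char>(fb),
                        std::istreambuf_iterator<char>()};
    std::cout << (a == b ? "identical (lossless)" : "different (lossy)")
              << "\n";
    return 0;
}
[/code]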

It is not correct to state that both are planar. Without the -outplanar flag, the output is NV12, which has interleaved U and V (google NV12 to see the memory layout). With -outplanar, the U and V are deinterleaved.

“Planar” means that there are three planes (blocks of contiguous memory), one with all the Y, one with all the U, and one with all the V.
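For anyone else hitting this, splitting NV12 into planar yourself is straightforward; presumably this is essentially what -outplanar does internally. A minimal sketch, assuming an 8-bit 4:2:0 frame with no row padding (pitch == width):

[code]
#include <cstddef>
#include <cstdint>
#include <cstring>

// Deinterleave an 8-bit NV12 frame (Y plane followed by interleaved UVUV...)
// into planar I420 (Y plane, then all U, then all V).
void nv12_to_i420(const uint8_t* nv12, uint8_t* i420,
                  size_t width, size_t height)
{
    const size_t lumaSize   = width * height;
    const size_t chromaSize = lumaSize / 4;          // 4:2:0 subsampling

    std::memcpy(i420, nv12, lumaSize);               // Y plane is unchanged

    const uint8_t* uv = nv12 + lumaSize;             // interleaved UVUV...
    uint8_t* u = i420 + lumaSize;
    uint8_t* v = u + chromaSize;
    for (size_t i = 0; i < chromaSize; ++i) {
        u[i] = uv[2 * i];
        v[i] = uv[2 * i + 1];
    }
}
[/code]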

Now this makes sense to me, thanks a lot for the insight! It was leading me astray because when I tried to convert the YUV file to raw RGB, I treated it as planar, which still gave me a planar-looking video but with washed colors. Now it makes much more sense. I have not gone deep into the HEVC standard, sorry for my lack of knowledge.

Thanks both of you for your help :)

No problem, Dread13! We are all always learning, sometimes the hard way! Good luck with your projects and post again if we can help with anything.

“Error: HandleVideoSequence : Codec not supported on this GPU at …/…/NvCodec/NvDecoder/NvDecoder.cpp:146”

I’m having the same problem and I don’t understand what’s causing it. Does it mean NVENC can encode 4:4:4 but CUVID/NVDEC cannot decode 4:4:4? Is there any alternative that achieves the quality gains of enabling 4:4:4?