So, keep in mind that 10-bit decode is not officially supported by NVIDIA, and the ability to turn it on is undocumented.
It’s quite possible there’s a bug in there somewhere; we really need to wait until Video SDK 8.0 is released and NVIDIA officially designates a working driver version.
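For reference, here is roughly what "turning it on" looks like through the cuvid API. This is a minimal, untested sketch: the bitDepthMinus8 field is present in the SDK headers but undocumented, and cudaVideoSurfaceFormat_P016 is my assumption for the matching 16-bit output format (it may only show up in the 8.0 headers), so treat every field here as subject to change once this is officially supported.

```c
/* Sketch: forcing 10-bit HEVC decode via the undocumented
 * bitDepthMinus8 field in CUVIDDECODECREATEINFO. Not tested. */
#include <string.h>
#include <cuda.h>
#include "cuviddec.h"   /* NVIDIA Video Codec SDK header */

CUvideodecoder create_hevc_10bit_decoder(unsigned int width,
                                         unsigned int height,
                                         CUvideoctxlock lock)
{
    CUVIDDECODECREATEINFO ci;
    memset(&ci, 0, sizeof(ci));

    ci.CodecType           = cudaVideoCodec_HEVC;
    ci.ChromaFormat        = cudaVideoChromaFormat_420;
    ci.bitDepthMinus8      = 2;  /* 10-bit: the undocumented switch */
    ci.OutputFormat        = cudaVideoSurfaceFormat_P016; /* assumption: 16-bit surfaces */
    ci.ulWidth             = width;
    ci.ulHeight            = height;
    ci.ulTargetWidth       = width;
    ci.ulTargetHeight      = height;
    ci.ulNumDecodeSurfaces = 20; /* generous, to cover the HEVC DPB */
    ci.ulNumOutputSurfaces = 2;
    ci.DeinterlaceMode     = cudaVideoDeinterlaceMode_Weave; /* progressive content */
    ci.vidLock             = lock;

    CUvideodecoder dec = NULL;
    if (cuvidCreateDecoder(&dec, &ci) != CUDA_SUCCESS)
        return NULL; /* driver rejected the config, or no 10-bit support */
    return dec;
}
```

If cuvidCreateDecoder fails here, it may just mean the installed driver doesn't have working 10-bit support rather than a bug in your code, which is exactly why an officially designated driver version would help.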
I’ve not seen this kind of corruption on any sample except yours. How was it encoded? Is it using any of the fancier HEVC features (tiles, wavefront parallel processing, etc.)?