Disturbing problems with decoding under Unity Engine

I wrote a library that wraps cuvid functionality. I made a small C++ demo that uses the lib. It works all fine. No problems at all.

Now I made a plugin for Unity Engine that uses this lib, and it has some problems. I get no runtime errors from either CUDA or cuvid, but for some unknown reason the first 6-7 frames I decode are empty. In HandlePictureDisplay I call cuvidMapVideoFrame/cuvidGetDecodeStatus/cudaMemcpy/cuvidUnmapVideoFrame, and none of these functions returns an error. But I only start getting actual data after 6 frames or so. The first 6-7 frames are filled with either all 0’s or all 1’s (talking 0-255 range). This doesn’t happen in my simple C++ demo.

There is actually one more problem: chroma data is empty. On output I get a very “greenish” image, as all chroma values are 1’s (talking 0-255 range).
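For anyone wondering why zeroed-out chroma shows up as green: a minimal BT.601-style YUV-to-RGB conversion (a standard approximation, not code from my plugin) makes it obvious. With U and V near 0, the (U - 128) and (V - 128) terms drive R and B below 0 (clamped) and G above 255 (clamped):

```cpp
#include <algorithm>
#include <cstdint>

// Minimal full-range BT.601 YUV -> RGB conversion (illustrative only).
// Neutral chroma is 128; values near 0 push the image toward green.
static uint8_t clamp8(float v) {
    return static_cast<uint8_t>(std::min(255.0f, std::max(0.0f, v)));
}

struct RGB { uint8_t r, g, b; };

RGB yuvToRgb(uint8_t y, uint8_t u, uint8_t v) {
    float fy = static_cast<float>(y);
    float fu = static_cast<float>(u) - 128.0f;  // Cb offset from neutral
    float fv = static_cast<float>(v) - 128.0f;  // Cr offset from neutral
    return {
        clamp8(fy + 1.402f * fv),               // R: dragged down when Cr ~ 0
        clamp8(fy - 0.344f * fu - 0.714f * fv), // G: boosted when Cb, Cr ~ 0
        clamp8(fy + 1.772f * fu),               // B: dragged down when Cb ~ 0
    };
}
```

With Y = 128 and U = V = 0 this yields pure green (0, 255, 0), while the neutral U = V = 128 gives gray, which matches the “greenish” output I was seeing.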

Could this be a decoder bug? The difference between my C++ demo app and Unity is that, well, Unity does a helluva lot of other things, so maybe some render states affect this? I can provide source code for inspection if necessary.

I think I found the problem. Turns out my plugin must have been running at a “place” in code (under Unity’s rendering engine) unfortunate enough that it inherited some stale render states, most notably a blend state. When my chroma shader was outputting data to an R8G8 texture in the form float4(chroma1, chroma2, 0.0f, 0.0f), on output I got only 1’s. When I changed that to output float4(chroma1, chroma2, 0.0f, 1.0f), all worked just fine. That led me to think that some render state must not have been right for my purposes. After forcefully setting the blend state to default, it seems to be working all right.
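To show why alpha = 0 wiped out the chroma writes, here is a plain C++ sketch (no graphics API involved) of the standard SrcAlpha/InvSrcAlpha blend equation a stale blend state would apply; the variable names are mine, not from the plugin:

```cpp
// out = src.rgb * src.a + dst.rgb * (1 - src.a)
// With src.a == 0 the shader output is discarded entirely and the
// render target keeps whatever it was cleared to (e.g. all 1's).
struct Color4 { float r, g, b, a; };

Color4 alphaBlend(const Color4& src, const Color4& dst) {
    float inv = 1.0f - src.a;
    return { src.r * src.a + dst.r * inv,
             src.g * src.a + dst.g * inv,
             src.b * src.a + dst.b * inv,
             src.a * src.a + dst.a * inv };
}
```

With src = float4(chroma1, chroma2, 0, 0) against a target cleared to 1’s, the result is all 1’s, exactly the symptom above; with alpha = 1 the chroma values pass through untouched. In D3D11 the “default blend state” reset amounts to calling ID3D11DeviceContext::OMSetBlendState with a null blend-state pointer, which disables blending.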