Hello.
On an Orin Nano, I am trying to use the NvBufSurfTransformCompositeBlend API to alpha-blend two images into a destination buffer. There are no examples of this API's usage, so I am trying to figure it out.
No matter what I try, I get the message “VIC operation is not supported”. So, either this API does not work on an Orin Nano, or I am using the API incorrectly.
I realize the buffers passed into the function need to have the same width and height. Do they need the same depth? For src0 and src1, I have buffers that are ARGB. For alpha, I have a buffer that is GRAY8 with the alpha assigned to the single channel. For the destination, I have a buffer that is ARGB. I tried having all four buffers as ARGB, but that didn’t work either.
Are these the correct formats and dimensions for the input parameters?
Is this API supported on Orin Nano?
If this API is not supported, what hardware-accelerated API should I use to alpha-blend two images?
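For reference, here is a minimal sketch of what I am doing. It is trimmed down (error checks and buffer-fill code removed), the width/height values are just placeholders, and the struct/field names are from my reading of nvbufsurface.h and nvbufsurftransform.h, so please correct me if I have mis-declared anything:

```c
/* Sketch of my attempt, built against DeepStream's
 * nvbufsurface.h / nvbufsurftransform.h on JetPack. */
#include "nvbufsurface.h"
#include "nvbufsurftransform.h"

static NvBufSurface *create_surface(uint32_t w, uint32_t h,
                                    NvBufSurfaceColorFormat fmt)
{
    NvBufSurface *surf = NULL;
    NvBufSurfaceCreateParams params = {0};
    params.gpuId       = 0;
    params.width       = w;
    params.height      = h;
    params.colorFormat = fmt;
    params.layout      = NVBUF_LAYOUT_PITCH;
    params.memType     = NVBUF_MEM_SURFACE_ARRAY; /* VIC-accessible memory on Jetson */
    if (NvBufSurfaceCreate(&surf, 1, &params) != 0)
        return NULL;
    return surf;
}

int blend_example(void)
{
    const uint32_t W = 1920, H = 1080; /* placeholder dimensions */

    /* All four surfaces share the same width/height; only the alpha
     * surface differs in format (single-channel, alpha in GRAY8). */
    NvBufSurface *src0  = create_surface(W, H, NVBUF_COLOR_FORMAT_ARGB);
    NvBufSurface *src1  = create_surface(W, H, NVBUF_COLOR_FORMAT_ARGB);
    NvBufSurface *alpha = create_surface(W, H, NVBUF_COLOR_FORMAT_GRAY8);
    NvBufSurface *dst   = create_surface(W, H, NVBUF_COLOR_FORMAT_ARGB);

    /* ... fill src0, src1, and alpha here ... */

    /* Params left zeroed/default -- not sure if that is correct. */
    NvBufSurfTransformCompositeBlendParams blend_params = {0};

    NvBufSurfTransform_Error err =
        NvBufSurfTransformCompositeBlend(src0, src1, alpha, dst, &blend_params);
    /* This is where I get "VIC operation is not supported". */
    return (err == NvBufSurfTransform_Success) ? 0 : -1;
}
```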
Thanks Dane.
I am trying to blend a PNG onto a camera stream, and this post seems to indicate that is not supported by nvcompositor: NVIDIA Forum Post. In your opinion, is the information in that post still true for the Orin Nano? I briefly tried it, and nvcompositor did not seem to recognize the alpha channel in the PNG.
In this case, the PNG is drawn on the video, but with no alpha blending.
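For context, the pipeline I tried looked roughly like this (reconstructed from memory, so the caps and pad properties may not be exactly what I ran; overlay.png is a stand-in file name):

```shell
# Camera on sink_0, PNG overlay on sink_1 of nvcompositor.
gst-launch-1.0 -e \
  nvcompositor name=comp \
    sink_0::xpos=0 sink_0::ypos=0 sink_0::width=1920 sink_0::height=1080 \
    sink_1::xpos=100 sink_1::ypos=100 ! nv3dsink \
  nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=1920,height=1080' ! comp.sink_0 \
  filesrc location=overlay.png ! pngdec ! imagefreeze ! nvvidconv ! \
    'video/x-raw(memory:NVMM),format=RGBA' ! comp.sink_1
```

The PNG shows up at the right position, but its transparent regions are rendered opaque.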
I will investigate the NvBufSurfTransformMultiInputBufCompositeBlend() API, see what it does for me, and report back.
Thanks again for your assistance.