How to use NvBufferTransformAsync and NvBufferSyncObjRec

Hi,
I’m using the VIC through the Multimedia API to composite and mirror 6 camera sources. I’m using NvBufferTransformAsync instead of NvBufferComposite because it supports flipping. For two cameras it works: I call NvBufferTransformAsync twice and then call NvBufferSyncObjWait to synchronize.
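In outline, the working two-camera flow looks like the sketch below (simplified; the dmabuf fds and transform-params setup are omitted, and the NvBufferSyncObj field names and the NvBufferSyncObjWait timeout value are my reading of nvbuf_utils.h, so please verify them against your JetPack header):

```cpp
#include <string.h>
#include "nvbuf_utils.h"

// Sketch: asynchronously flip/composite two source dmabufs into one
// destination buffer via VIC, then wait for both transforms to finish.
// src_fd[], dst_fd, and params[] (per-camera dst_rect, flip flag set)
// are assumed to be prepared elsewhere.
int transform_two(int src_fd[2], int dst_fd, NvBufferTransformParams params[2])
{
    NvBufferSyncObj syncobj[2];
    memset(syncobj, 0, sizeof(syncobj));

    for (int i = 0; i < 2; i++) {
        syncobj[i].use_outsyncobj = 1;  // request an output sync point from VIC
        if (NvBufferTransformAsync(src_fd[i], dst_fd, &params[i], &syncobj[i]) != 0)
            return -1;
    }

    // Block until both transforms have signalled their output sync points
    // before reading dst_fd. 0xFFFFFFFF is assumed here to mean "no timeout";
    // check the timeout semantics in your nvbuf_utils.h.
    for (int i = 0; i < 2; i++) {
        if (NvBufferSyncObjWait(&syncobj[i].outsyncobj, 0xFFFFFFFF) != 0)
            return -1;
    }
    return 0;
}
```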
When I try to do the same for 6 cameras, my program crashes.

Can you describe how I can use NvBufferTransformAsync for more than 2 cameras?
Is there a hardware limit on how many times NvBufferTransformAsync can be called before a NvBufferSyncObjWait call?
Can you describe the limitations and usage of NvBufferTransformAsync and NvBufferSyncObj in more detail than what is currently at Jetson Linux API Reference: Main Page?

Thanks a lot!

Hi,
In the 6-camera case, do you create an NvBufferSession? If not, please create one and set it in NvBufferTransformParams.
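For example, something like this (a simplified sketch, not code from a sample; NUM_CAMERAS, params[], and the setup/teardown helpers are illustrative names):

```cpp
#include "nvbuf_utils.h"

#define NUM_CAMERAS 6

// Sketch: create one NvBufferSession and attach it to every camera's
// transform params so all six asynchronous VIC transforms run in the
// same session.
static NvBufferSession session;
static NvBufferTransformParams params[NUM_CAMERAS];

void setup_session(void)
{
    session = NvBufferSessionCreate();
    for (int i = 0; i < NUM_CAMERAS; i++)
        params[i].session = session;  // shared by all six transforms
}

void teardown_session(void)
{
    NvBufferSessionDestroy(session);
}
```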

If creating an NvBufferSession does not help, we would need you to share a patch against one of the jetson_multimedia_api samples so that we can reproduce the issue and investigate.

Hi DaneLLL,
No, I thought it was enough to call the async version of NvBufferTransform. It works now that I call NvBufferSessionCreate() and set the session in the transform params of each of the six cameras.
Thanks for the quick reply and solution!

It would be nice if the Multimedia API documentation explained what happens in the hardware a bit more, e.g. what a hardware buffer actually is in the Jetson case, compared to a “software buffer”.

Hi,
Thanks for the suggestion. We will check whether we can add more information about the hardware DMA buffer (NvBuffer) to the documentation.
