I’m using the VIC through the Multimedia API to composite and mirror 6 camera sources. I’m using NvBufferTransformAsync instead of NvBufferComposite because it supports flipping, which NvBufferComposite does not. For two cameras it works: I call NvBufferTransformAsync twice and then call NvBufferSyncObjWait to synchronize.
When I try to do the same for 6 cameras, my program crashes.
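To make the question concrete, here is a simplified sketch of the per-frame pattern I am using, generalized from two cameras to N (buffer allocation with NvBufferCreateEx and frame capture are omitted; the sync-object field names and the timeout semantics are my reading of nvbuf_utils.h, so please correct me if I am misusing them):

```c
#include <string.h>
#include "nvbuf_utils.h"

#define NUM_CAMERAS 6
/* I am unsure of the exact "infinite" timeout value/unit for
 * NvBufferSyncObjWait, so this is a placeholder, not the official macro. */
#define SYNC_WAIT_TIMEOUT 0xFFFFFFFFu

/* src_fds[i] / dst_fds[i] are per-camera NvBuffer dmabuf fds,
 * allocated elsewhere with NvBufferCreateEx(). */
int transform_all(int src_fds[NUM_CAMERAS], int dst_fds[NUM_CAMERAS])
{
    NvBufferTransformParams params;
    NvBufferSyncObj syncobj[NUM_CAMERAS];

    memset(&params, 0, sizeof(params));
    params.transform_flag = NVBUFFER_TRANSFORM_FLIP;
    params.transform_flip = NvBufferTransform_FlipX;  /* mirror horizontally */

    /* Queue one async VIC transform per camera, each with its own
     * sync object so they can be waited on individually. */
    for (int i = 0; i < NUM_CAMERAS; i++) {
        memset(&syncobj[i], 0, sizeof(syncobj[i]));
        syncobj[i].use_outsyncobj = 1;
        if (NvBufferTransformAsync(src_fds[i], dst_fds[i],
                                   &params, &syncobj[i]) != 0)
            return -1;
    }

    /* Wait for every queued transform to complete. With NUM_CAMERAS == 2
     * this works; with 6 the program crashes. */
    for (int i = 0; i < NUM_CAMERAS; i++) {
        if (NvBufferSyncObjWait(&syncobj[i].outsyncobj,
                                SYNC_WAIT_TIMEOUT) != 0)
            return -1;
    }
    return 0;
}
```

Is this per-transform sync-object pattern the intended way to batch more than two async transforms, or should all transforms share a single NvBufferSyncObj?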
Can you describe how I can use NvBufferTransformAsync for more than 2 cameras?
Are there hardware limitations on how many times you can call NvBufferTransformAsync before waiting on the sync object with NvBufferSyncObjWait?
Can you give more detail than what is currently in the Jetson Linux API Reference (Main Page) about the limitations of NvBufferTransformAsync and NvBufferSyncObj, and how they are meant to be used together?
Thanks a lot!