Synchronizing hardware to a CSI camera using libargus

Hi,

I am trying to synchronise a separate piece of hardware with frame capture on an IMX219 CSI camera. I am using the Leopard Imaging LI-JXAV-MIPI-ADPT-4CAM along with the IMX219 camera (+ adapter) on an AGX Xavier running L4T 32.3.1. I have a function which interacts with my external hardware, and I want to trigger it between frame captures on the IMX219. I've tried using the libargus capture(request) function and running my external hardware function between capture(request) calls, but this is too slow: each capture(request) call performs a full MIPI stream initialization, capture, and shutdown cycle, taking about 200-400 ms.
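
For reference, here is roughly what my current loop looks like (a simplified sketch: output stream setup and error checks are omitted, and triggerExternalHardware() is a stand-in for my own hardware function):

```cpp
#include <Argus/Argus.h>
#include <vector>

using namespace Argus;

// Stand-in for my external hardware function.
void triggerExternalHardware();

int main()
{
    UniqueObj<CameraProvider> provider(CameraProvider::create());
    ICameraProvider *iProvider = interface_cast<ICameraProvider>(provider);

    std::vector<CameraDevice*> devices;
    iProvider->getCameraDevices(&devices);

    UniqueObj<CaptureSession> session(
        iProvider->createCaptureSession(devices[0]));
    ICaptureSession *iSession = interface_cast<ICaptureSession>(session);

    UniqueObj<Request> request(iSession->createRequest());
    // ... output stream setup omitted ...

    for (int i = 0; i < 100; i++)
    {
        // Each capture() performs a full stream init/capture/shutdown
        // cycle, which is where the 200-400 ms per frame comes from.
        iSession->capture(request.get());
        iSession->waitForIdle();

        triggerExternalHardware();
    }
    return 0;
}
```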
Unfortunately the IMX219 doesn’t have a hardware trigger, which would probably solve my problem.
To get the fastest response from the IMX219, the only viable option seems to be the repeat(request) function. One solution I could see working is to call my function between the EOF and SOF flags of the CSI-2 transfer of the image. Is there a flag or function in libargus that would either let me call my function after the EOF of the CSI-2 transfer, or at least tell me when this occurs?
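
For context, the repeat-based structure I have in mind looks something like the sketch below (again, triggerExternalHardware() is my own function; the open question is whether the point where it runs actually falls between the EOF and SOF of the CSI-2 transfers):

```cpp
#include <Argus/Argus.h>
#include <EGLStream/EGLStream.h>

using namespace Argus;

// Stand-in for my external hardware function.
void triggerExternalHardware();

void captureLoop(ICaptureSession *iSession, Request *request,
                 OutputStream *stream)
{
    // Consumer that acquires frames from the EGLStream output.
    UniqueObj<EGLStream::FrameConsumer> consumer(
        EGLStream::FrameConsumer::create(stream));
    EGLStream::IFrameConsumer *iConsumer =
        interface_cast<EGLStream::IFrameConsumer>(consumer);

    // repeat() keeps the MIPI stream running, avoiding the
    // 200-400 ms init/shutdown cost of individual capture() calls.
    iSession->repeat(request);

    for (int i = 0; i < 100; i++)
    {
        UniqueObj<EGLStream::Frame> frame(iConsumer->acquireFrame());

        // At this point libargus has handed over a frame, but that is
        // not the same as the sensor having finished its CSI-2
        // transfer -- which is exactly the gap I'm asking about.
        triggerExternalHardware();
    }

    iSession->stopRepeat();
    iSession->waitForIdle();
}
```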

Alternatively, is there any way I can modify the request object so that a function is triggered any time a request is executed or finished? It doesn't seem like there's a way to do this through the request object itself, as the requests are handled in the background of the libargus library. Is there any function or flag in libargus that I could use to do this?

If this can't be done using libargus, can this functionality be added at the driver level for the camera?
If it's a driver-level change, would this be something that I'd need to talk to Leopard Imaging about, or is there general documentation I could use to do this?

Thanks for any help.

Please refer to the topic below for the frame synchronization implementation.

Thanks for the speedy response, Shane.

I've read through the syncSensor example and the posts about using timestamps to synchronise data after the fact; however, I'm not sure they'll help in my case, or whether I'm just misunderstanding how they work.

My understanding of the syncSensor example is that it allows me to synchronize two separate camera producers so that they capture frames at the same time (though based on some people's experience, it doesn't always work). This method appears to do the synchronization in the libargus background; it doesn't explicitly expose when the hardware has finished transferring a frame's data. I need to execute my function between frame data transfers from the hardware. Is there something I'm missing in this example that would let me call my function explicitly after the hardware has finished transferring a frame?
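
For reference, my mental model of that sample is roughly the following (a sketch only, with stream and request details omitted as in the sample):

```cpp
#include <Argus/Argus.h>
#include <vector>

using namespace Argus;

// My understanding of syncSensor: a single capture session is created
// from both camera devices, so every request submitted with repeat()
// is dispatched to the two sensors together.
int main()
{
    UniqueObj<CameraProvider> provider(CameraProvider::create());
    ICameraProvider *iProvider = interface_cast<ICameraProvider>(provider);

    std::vector<CameraDevice*> devices;
    iProvider->getCameraDevices(&devices);  // expects two sensors here

    // One session for both devices -- this is where the "background"
    // synchronization happens.
    UniqueObj<CaptureSession> session(
        iProvider->createCaptureSession(devices));
    ICaptureSession *iSession = interface_cast<ICaptureSession>(session);

    UniqueObj<Request> request(iSession->createRequest());
    // ... create one output stream per sensor and enable both on the
    // request, as in the syncSensor sample ...

    iSession->repeat(request.get());
    // ... acquire frames from both consumers, compare timestamps ...

    iSession->stopRepeat();
    iSession->waitForIdle();
    return 0;
}
```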
I've tried executing my function directly after an acquireFrame() call (when I know the libargus internal buffer is empty, so it should be returning the frame just sent from the camera), but there's still a delay between when libargus signals that a frame is on the queue and when the hardware has actually finished transferring a frame.
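
One way I've been looking at that delay is to compare the capture metadata's sensor timestamp with the CPU time at which acquireFrame() returns, along these lines (a sketch; it assumes the sensor timestamp and CLOCK_MONOTONIC share a time base, which I understand may need an offset correction on some L4T releases):

```cpp
#include <Argus/Argus.h>
#include <EGLStream/EGLStream.h>
#include <cstdint>
#include <cstdio>
#include <time.h>

using namespace Argus;

// Rough look at the gap between the frame's sensor timestamp and the
// moment acquireFrame() returns. Note that (as I understand it)
// getSensorTimestamp() marks start of frame, so this span also
// includes exposure/readout and libargus queueing, not just the
// post-EOF delay. Error checks omitted.
void printAcquireLatency(EGLStream::IFrameConsumer *iConsumer)
{
    UniqueObj<EGLStream::Frame> frame(iConsumer->acquireFrame());

    struct timespec now;
    clock_gettime(CLOCK_MONOTONIC, &now);
    uint64_t acquireNs =
        (uint64_t)now.tv_sec * 1000000000ULL + (uint64_t)now.tv_nsec;

    EGLStream::IArgusCaptureMetadata *iArgusMeta =
        interface_cast<EGLStream::IArgusCaptureMetadata>(frame);
    ICaptureMetadata *iMeta =
        interface_cast<ICaptureMetadata>(iArgusMeta->getMetadata());

    printf("acquireFrame() returned %llu ns after the sensor timestamp\n",
           (unsigned long long)(acquireNs - iMeta->getSensorTimestamp()));
}
```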

Looking through the API, the closest thing I could find that seems like it might do what I'm looking for is the EGLSync interface, but I'm not sure if this is its intended purpose, or if it would just leave me at the same point. It's also possible that the level of hardware interaction I'm looking for isn't available through the libargus framework.

Also, the timestamp methods won't work for me, as they're intended for synchronization via post-processing.

The syncSensor sample requires sensors with a hardware sync design.
Because Argus has an internal capture buffer, there is a chance of getting non-synchronized frames; that is why timestamp checking is needed to make sure the frames are in sync. Once they are in sync, they should stay in sync.
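
For illustration, the timestamp check could look something like this sketch (error handling omitted; the threshold value is an assumption and would depend on the frame interval):

```cpp
#include <Argus/Argus.h>
#include <EGLStream/EGLStream.h>
#include <cstdint>
#include <cstdlib>

using namespace Argus;

// Returns true when two frames' sensor timestamps fall within
// thresholdNs of each other, i.e. they came from the same
// synchronized capture. If the check fails, drop a frame from the
// stream with the earlier timestamp and re-acquire until the pair
// lines up; after that the streams should stay in step.
bool framesAreSynced(EGLStream::Frame *frameA, EGLStream::Frame *frameB,
                     int64_t thresholdNs)
{
    ICaptureMetadata *iMetaA = interface_cast<ICaptureMetadata>(
        interface_cast<EGLStream::IArgusCaptureMetadata>(frameA)->getMetadata());
    ICaptureMetadata *iMetaB = interface_cast<ICaptureMetadata>(
        interface_cast<EGLStream::IArgusCaptureMetadata>(frameB)->getMetadata());

    int64_t delta = (int64_t)iMetaA->getSensorTimestamp() -
                    (int64_t)iMetaB->getSensorTimestamp();
    return std::llabs(delta) < thresholdNs;
}
```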