Keeping camera synchronization at the software level

Hi,

we have a system based on a Xavier SoC and a GMSL deserializer that reads from two independent cameras. Whenever one of the two cameras is started via the Argus API, the camera drivers configure the deserializer to generate a periodic frame-sync signal for both sensors. When we open the two cameras, we can verify that they are synced at the HW level by scoping the “FRAME_VALID” pins. However, we have noticed that the frames we receive at the SW level are sometimes not synchronized, as if one video stream were a few frames ahead of the other.

After some tests, we suspect that this is caused by a lack of determinism in the opening sequence at the SW level: even though the trigger signal starts and arrives at both sensors at the same time, the processing pipelines inside the SoC (CSI channels, VI, ISP…) for the two video streams are not configured concurrently.

Do you know what steps we should follow to configure the cameras in such a way that the frames are always delivered to user space in sync? Is there a generic camera sensor driver or any documentation to follow to get this feature?

Thanks

Did you try the syncSensor sample code?

Hi ShaneCCC,

I think we tried that application before getting the cameras synced in HW, but we’ll check it again. Thanks for your feedback.

Please, could you provide a high-level description of how this synchronization is also kept at the SW level in the case of the syncSensor sample? We would like to understand whether we are still missing something in our camera sensor driver.

Hi,

Let me clarify my previous comment. We already know, and have checked, that there is HW synchronization between the two cameras. However, when we read from the application we detect that one video stream is sometimes ahead of the other by a certain number of frames. The actual question is whether there is a way to tell Argus to keep this synchronization when delivering the frames to the application. If so, is there any feature or procedure to follow in the sensor driver?

My understanding is that the sample application you mention is meant to provide a kind of SW-based synchronization workaround. Is this correct?

In short, the objective is to ensure that the application reads two synchronized video streams without having to check the timestamps.

Thanks!

hello alejandro.concepcion,

the syncSensor sample is a software-based approach to synchronization.
this example uses the multiple-sensors-per-single-capture-session method, which means it duplicates a single capture request to both camera sensors.
hence, you should expect the two capture results to be close enough.
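for reference, here is a minimal sketch of that setup, condensed from the syncSensor sample (error handling, consumer setup, and EGL stream configuration omitted; the exact stream-settings calls can vary between Argus releases):

```cpp
#include <Argus/Argus.h>
#include <vector>

using namespace Argus;

int main()
{
    UniqueObj<CameraProvider> provider(CameraProvider::create());
    ICameraProvider *iProvider = interface_cast<ICameraProvider>(provider);

    std::vector<CameraDevice*> devices;
    iProvider->getCameraDevices(&devices); // expects at least two sensors

    // One capture session owning BOTH sensors: a request submitted to this
    // session is duplicated internally for each device.
    std::vector<CameraDevice*> sensors(devices.begin(), devices.begin() + 2);
    UniqueObj<CaptureSession> session(iProvider->createCaptureSession(sensors));
    ICaptureSession *iSession = interface_cast<ICaptureSession>(session);

    // One output stream per sensor, bound with setCameraDevice().
    UniqueObj<OutputStreamSettings> settings(
        iSession->createOutputStreamSettings(STREAM_TYPE_EGL));
    IOutputStreamSettings *iSettings =
        interface_cast<IOutputStreamSettings>(settings);

    iSettings->setCameraDevice(sensors[0]);
    UniqueObj<OutputStream> left(iSession->createOutputStream(settings.get()));
    iSettings->setCameraDevice(sensors[1]);
    UniqueObj<OutputStream> right(iSession->createOutputStream(settings.get()));

    // A single repeating request drives both streams, so both captures are
    // scheduled together instead of through two independently opened pipelines.
    UniqueObj<Request> request(iSession->createRequest());
    IRequest *iRequest = interface_cast<IRequest>(request);
    iRequest->enableOutputStream(left.get());
    iRequest->enableOutputStream(right.get());
    iSession->repeat(request.get());

    // ... consume frames from both streams, then iSession->stopRepeat() ...
    return 0;
}
```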
in addition, you may also refer to getSensorTimestamp() to gather the sensor hardware timestamp; please implement a timestamp comparison in user space to verify the synchronization results.
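a minimal sketch of such a check, assuming frames are acquired through an EGLStream::FrameConsumer; the helper names and the 30 fps frame period below are illustrative assumptions, not part of the sample:

```cpp
#include <Argus/Argus.h>
#include <EGLStream/EGLStream.h>
#include <cstdint>

using namespace Argus;

// Read the sensor (start-of-frame) timestamp, in nanoseconds, from the
// metadata attached to an acquired EGLStream frame.
static uint64_t sensorTimestampNs(EGLStream::Frame *frame)
{
    EGLStream::IArgusCaptureMetadata *iArgusMeta =
        interface_cast<EGLStream::IArgusCaptureMetadata>(frame);
    ICaptureMetadata *iMeta =
        interface_cast<ICaptureMetadata>(iArgusMeta->getMetadata());
    return iMeta->getSensorTimestamp();
}

// Hypothetical pairing check: two frames whose sensor timestamps differ by
// less than half a frame period were exposed on the same frame-sync pulse.
// 33,333,333 ns assumes a 30 fps mode; adjust to your sensor configuration.
static bool framesArePaired(uint64_t leftNs, uint64_t rightNs,
                            uint64_t framePeriodNs = 33333333ULL)
{
    uint64_t diff = (leftNs > rightNs) ? leftNs - rightNs : rightNs - leftNs;
    return diff < framePeriodNs / 2;
}
```

if the check fails at startup, the application can drop frames from the leading stream until the timestamps line up, which removes the constant few-frame offset described above.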
BTW,
please also check similar discussion threads, such as Topic 1061467 and Topic 1007933, for reference.
thanks