We have a system based on a Xavier SoC and a GMSL deserializer that reads from two independent cameras. Whenever one of the two cameras is started via the Argus API, the camera drivers configure the deserializer to generate a periodic frame sync signal for both sensors. If we open the two cameras, we can verify that they are synced at the HW level by scoping the "FRAME_VALID" pins. However, we have noticed that sometimes the frames we receive at the SW level are not synchronized, as if one video stream were a few frames ahead of the other.
After some tests, we suspect this is caused by a lack of determinism in the opening sequence at the SW level: even though the trigger signal starts and arrives at both sensors at the same time, the processing pipeline inside the SoC (CSI channels, VI, ISP, ...) for each video stream is not configured concurrently.
Do you know what steps we should follow to configure the cameras so that the frames are always delivered to user space in sync? Is there a reference camera sensor driver or any documentation we can follow to achieve this?