We have a problem with the synchronization of multiple cameras. The issue occurs on two different hardware setups: a Jetson TX2 with three IMX185 image sensors (L4T 32.3.1 and L4T 32.4.2) and a Jetson AGX Xavier with two or three IMX477 image sensors (L4T 32.3.1). On the TX2 setup, one sensor runs in master mode and the others in slave mode. We checked the synchronization signals with an oscilloscope and they looked fine. On the Xavier we use the MIPI adapter from Leopard Imaging, which has an FPGA for synchronization. So from the hardware side, everything looks good so far.
On the software side we use libargus to access the cameras. All camera devices are attached to the same capture session. An EGLStream::FrameConsumer is used to retrieve the EGLStream::Frame objects from each output stream. We read the sensor timestamps via getSensorTimestamp() and compare them across the streams: frames are buffered until the timestamps match, and frames with identical timestamps are passed to the image processing chain. Unfortunately, these frames (which should have been captured at the same time, since they carry the same sensor timestamp) are sometimes out of sync. One image appears to be delayed by exactly one frame. The issue occurs randomly; sometimes the images are perfectly synchronous. If I reduce the frame rate, the synchronization improves. I also tried boosting the clocks as described here, but that did not help either.
Am I doing anything wrong, or is there a bug somewhere in the camera stack?