Camera synchronization problem


We have some problems with the synchronization of multiple cameras. The issue occurs with two different hardware setups: a Jetson TX2 with three IMX185 image sensors (L4T 32.3.1 and L4T 32.4.2) and a Jetson AGX Xavier with two or three IMX477 image sensors (L4T 32.3.1). On the TX2 setup, one sensor runs in master mode and the others in slave mode. We checked the synchronization signals with an oscilloscope and they looked fine. On the Xavier we use the MIPI adapter from Leopard Imaging, which has an FPGA for synchronization. So from the hardware side, everything looks good so far.

On the software side we use libargus to access the cameras. All camera devices are attached to the same capture session. An EGLStream::FrameConsumer is used to retrieve the EGLStream::Frame objects from each output stream. We read the sensor timestamps using the getSensorTimestamp() method and compare them between the frames. The frames are buffered until the timestamps match, and frames with the same timestamp are passed to the image processing chain. Unfortunately, these frames (which should have been captured at the same time because they carry the same sensor timestamp) are sometimes out of sync: one image appears to be delayed by exactly one frame. The issue occurs randomly; sometimes the images are perfectly synchronous. If I reduce the frame rate, the synchronization improves. I also tried boosting the clocks as described here, but that did not help either.
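
For reference, our matching logic looks roughly like this (a simplified, self-contained sketch; FrameMatcher and PendingFrame are just names for this example, and in the real code each entry also holds the EGLStream::Frame handle):

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <deque>
#include <vector>

// One pending frame per camera, identified only by its sensor timestamp (ns).
struct PendingFrame {
    uint64_t sensorTimestampNs;
};

// Buffers frames per camera and emits a matched set once every camera has
// delivered a frame with the same sensor timestamp. Frames older than the
// newest head timestamp are dropped.
class FrameMatcher {
public:
    explicit FrameMatcher(size_t numCameras) : queues_(numCameras) {}

    void push(size_t camera, uint64_t timestampNs) {
        queues_[camera].push_back({timestampNs});
    }

    // Returns true and fills 'out' when all queues share a common head timestamp.
    bool tryMatch(std::vector<uint64_t>* out) {
        while (true) {
            uint64_t newest = 0;
            for (const auto& q : queues_) {
                if (q.empty()) return false;  // some camera has no frame yet
                newest = std::max(newest, q.front().sensorTimestampNs);
            }
            bool aligned = true;
            for (auto& q : queues_) {
                // Drop frames that can never match the newest head timestamp.
                while (!q.empty() && q.front().sensorTimestampNs < newest)
                    q.pop_front();
                if (q.empty() || q.front().sensorTimestampNs != newest)
                    aligned = false;
            }
            if (!aligned) {
                for (const auto& q : queues_)
                    if (q.empty()) return false;  // a queue ran dry while dropping
                continue;  // heads changed, re-evaluate
            }
            out->clear();
            for (auto& q : queues_) {
                out->push_back(q.front().sensorTimestampNs);
                q.pop_front();
            }
            return true;
        }
    }

private:
    std::vector<std::deque<PendingFrame>> queues_;
};
```

In the real application the timestamps come from the capture metadata of each frame (Argus::ICaptureMetadata::getSensorTimestamp()).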

Am I doing anything wrong or is there a bug somewhere in the camera stack?

Best regards

I am not familiar with the IMX185, but some Sony sensors implement synchronization in a way that might not be expected. The approach I have seen is that the sensors run in a free-running mode and the frame sync signal ‘realigns’ the clocks. When a sensor is too far from the sync pulse it will realign, but that realignment can incur a penalty. Perhaps that is what you are seeing. With sensors from other manufacturers, the sync signal triggers an exposure directly.

Hi D3_growe,

On the IMX185, we feed the slave sensors with the H/V sync signals (XHS and XVS) from the master sensor. This seems to work quite well, because some images are perfectly synced.
The timestamps from the sensor metadata are also equal, so I think the Jetson receives the images at the same time. But somehow the image buffers do not contain images recorded at the same point in time.

Ok, I did some further investigation: it seems that the synchronization problem persists until I change any camera parameter. After changing, for example, the exposure compensation, the synchronization looks good. So maybe this is a bug in libargus?

I realized that there is no need to change a parameter. Just restarting the capture session (calling stopRepeat -> waitForIdle -> repeat) is enough. It’s weird that it doesn’t work on the first run. Maybe something goes wrong during initialization inside libargus?

hello MarkusHess,

there’s sensor register programming during sensor initialization; all user-space settings are also applied after the stopRepeat() function is called.
you may also refer to the Multimedia API Reference for a description of ICaptureSession.

Hi JerryChang,

I add three camera devices to the same capture session, configure the capture settings and start the session. Then I read the images using the EGLStream::FrameConsumer. I also read the sensor timestamp from the metadata to make sure that the timestamps are equal. Nevertheless, even though the sensor timestamps of all frames are equal, at least one image is captured at a different point in time. It seems to be delayed by exactly one frame (we capture at 30 fps and one image is delayed by ~0.03 s). This looks to me like a bug in the camera stack: frames with the same timestamp should contain images captured at the same point in time.
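
To quantify the delay, I express the difference between two capture times in whole frame periods; a minimal self-contained sketch (offsetInFrames is just a name for this example):

```cpp
#include <cstdint>

// Frame period at 30 fps, in nanoseconds (~33.3 ms).
constexpr int64_t kFramePeriodNs = 1000000000LL / 30;

// Offset between two capture times, expressed in whole frame periods
// (0 = synchronous, 1 = one frame apart, ...). Rounds to the nearest frame,
// so small jitter well below half a period still counts as synchronous.
int64_t offsetInFrames(int64_t captureTimeANs, int64_t captureTimeBNs) {
    int64_t delta = captureTimeANs - captureTimeBNs;
    if (delta < 0) delta = -delta;
    return (delta + kFramePeriodNs / 2) / kFramePeriodNs;
}
```

With this, the bad case above shows up as an offset of 1 between the delayed camera and the others, even though their sensor timestamps are equal.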

As I wrote, restarting the capture session seems to help, but I am not sure whether this works reliably.

hello MarkusHess,

may I know whether you had set the device to enter performance mode for verification.
you may refer to Supported Modes and Power Efficiency and configure MaxN for testing.

Hi JerryChang,
Yes, it also happens with MaxN. I also boosted the clocks as described in the wiki:

sudo su
echo 1 > /sys/kernel/debug/bpmp/debug/clk/vi/mrq_rate_locked
echo 1 > /sys/kernel/debug/bpmp/debug/clk/isp/mrq_rate_locked
echo 1 > /sys/kernel/debug/bpmp/debug/clk/nvcsi/mrq_rate_locked
cat /sys/kernel/debug/bpmp/debug/clk/vi/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/vi/rate
cat /sys/kernel/debug/bpmp/debug/clk/isp/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/isp/rate
cat /sys/kernel/debug/bpmp/debug/clk/nvcsi/max_rate | tee /sys/kernel/debug/bpmp/debug/clk/nvcsi/rate

This did not help either.

hello MarkusHess,

may I know if it is possible to reduce the number of cameras to two to check the synchronization issue?
had you also enabled a single capture session for multiple sources, i.e. Argus/samples/syncSensor?

Hi JerryChang,

Yes, I had done this before. With two cameras, the synchronization gets better on the Jetson TX2 with the IMX185 sensors, but on the Jetson AGX Xavier with the IMX477 cameras the images are still out of sync. Note that the IMX477 has a higher resolution than the IMX185.
With three IMX185 cameras on the Jetson TX2, it seems that two of them are synchronous and the third is delayed by one frame. The cameras are running in a single capture session.

hello MarkusHess,

one more question:
could you please share the hardware connections of these multi-camera setups?
are the camera devices sharing CSI bricks, or does each camera use an individual CSI brick?
for example, please check the Port Index section for the diagram and share the details.

Hi JerryChang,

we use 4 lanes per camera:
Camera 0 -> CSI A, B
Camera 1 -> CSI C, D
Camera 2 -> CSI E, F