libargus synchronize timestamps with CLOCK_MONOTONIC_RAW


I’m currently using the camera LI-V024M-MIPI-IPEX30 from Leopard imaging and the Argus API to pull frames off the camera. I have an external IMU (InertialSense) that I’m trying to synchronize with the camera frames. I currently timestamp the IMU data coming off the sensor using:

struct timespec time;
clock_gettime(CLOCK_MONOTONIC_RAW, &time);
auto systemTimeNS = time.tv_sec * 1000000000L + (long)time.tv_nsec;

and then subtract 4ms for the signal latency.
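For reference, the conversion and the 4 ms latency subtraction can be wrapped in a small helper. This is only a sketch of what I'm doing; the helper names and the latency constant are mine, not from any Argus or InertialSense API, and the multiplication is done in explicit 64-bit math so it also holds on 32-bit targets:

```cpp
#include <cassert>
#include <cstdint>
#include <ctime>

// Convert a timespec to nanoseconds with explicit 64-bit math so the
// multiplication cannot overflow even where time_t/long are 32-bit.
static int64_t timespecToNs(const timespec &t) {
    return static_cast<int64_t>(t.tv_sec) * 1000000000LL + t.tv_nsec;
}

// Hypothetical helper: timestamp an IMU sample on CLOCK_MONOTONIC_RAW,
// compensating for the ~4 ms signal latency mentioned above.
static int64_t imuSampleTimestampNs() {
    constexpr int64_t kSignalLatencyNs = 4000000LL;  // 4 ms
    timespec now{};
    clock_gettime(CLOCK_MONOTONIC_RAW, &now);
    return timespecToNs(now) - kSignalLatencyNs;
}
```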

But I’m having trouble getting the timestamp of when the camera image was snapped from the Argus API to match the CLOCK_MONOTONIC_RAW timestamp I’m using for the IMU. I’m using

ICaptureMetadata::getSensorTimestamp()

from the Argus API to get the camera’s timestamp, but it seems to be running ahead of the CLOCK_MONOTONIC_RAW timestamp. I also tried

frame->getTime() - iMetadata->getSensorTimestamp()

and then subtract that difference from the current CLOCK_MONOTONIC_RAW value, but I don’t think this is the right solution. I’m trying to get sub-1 ms accuracy between the timestamps of my external IMU and the camera. How can I achieve this?
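For concreteness, the offset approach I describe can be sketched like this (helper names are hypothetical; the sensor timestamp is whatever `getSensorTimestamp()` returns for a frame whose metadata was just received):

```cpp
#include <cassert>
#include <cstdint>
#include <ctime>

// Current CLOCK_MONOTONIC_RAW time in nanoseconds.
static int64_t monotonicRawNs() {
    timespec t{};
    clock_gettime(CLOCK_MONOTONIC_RAW, &t);
    return static_cast<int64_t>(t.tv_sec) * 1000000000LL + t.tv_nsec;
}

// Estimate a constant offset between the camera's timestamp domain and
// CLOCK_MONOTONIC_RAW, using a frame that was just delivered. Note: any
// delay between exposure and this call biases the estimate, and clock
// drift means the offset is not truly constant.
static int64_t estimateOffsetNs(int64_t sensorTimestampNs) {
    return monotonicRawNs() - sensorTimestampNs;
}

// Map a camera timestamp onto the IMU (CLOCK_MONOTONIC_RAW) timescale.
static int64_t cameraToImuNs(int64_t sensorTimestampNs, int64_t offsetNs) {
    return sensorTimestampNs + offsetNs;
}
```

The weakness is that the estimate absorbs whatever capture/delivery latency exists at estimation time, which is why this alone seems unlikely to give sub-1 ms accuracy.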

hello moeeab5r,

may I have more details?
for example,

  1. which JetPack release are you working with?
  2. how many cameras would you like to synchronize?
  3. which software approach are you working with? please refer to the two cases below.
  • [case-1]: multiple sensors with multiple sessions, launching each camera sensor with a different capture request; they are not synchronized.
  • [case-2]: multiple sensors in a single session, using a single capture request to launch several camera frames at once.

FYI, the argus_camera and MMAPI examples are based on [case-1].

  4. please also check similar topics, such as Topic 1038131, Topic 1056202, and Topic 1046381.

  1. I’m using JetPack 3.1 / r28.1.
  2. I’m currently using one camera, but it’s not that I want multiple cameras synchronized with each other. What I’m trying to do is get the camera’s timestamp synchronized with the kernel timestamp. I want the timestamp I get from the camera to be on the same timescale as when I call

clock_gettime(CLOCK_MONOTONIC_RAW, &time);

I’m doing this because I’m synchronizing other sensors connected to the board, and I’m timestamping those sensor outputs using CLOCK_MONOTONIC_RAW in C++.
  3. I’m just using one camera; I’m not trying to synchronize multiple cameras at the moment.

hello moeeab5r,

since you’re working with JetPack-3.1,

  1. you may refer to Topic 1038067 for the patch to fix the FE syncpt wait,
  2. and also check Topic 1020202 for the patch to update the timestamp of the video buffer.

alternatively, you could move to the latest JetPack release; we have also made some kernel updates to bring the CLOCK_MONOTONIC_RAW time close to the v4l2 timestamps.
please check the code snippet below; here’s how the v4l2 capture path records the timestamp and saves it to the capture buffer.


static bool vi_notify_wait(...)
    ...
    /* record the start-of-frame (SOF) timestamp into the capture buffer */
    *ts = ns_to_timespec((s64)status.sof_ts);
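Since stock v4l2 stamps buffers on CLOCK_MONOTONIC rather than CLOCK_MONOTONIC_RAW, one quick sanity check from user space is to sample both clocks back-to-back and look at their difference. A small sketch (function names are mine, not from any kernel or v4l2 API):

```cpp
#include <cassert>
#include <cstdint>
#include <ctime>

// Read a clock as nanoseconds.
static int64_t clockNs(clockid_t id) {
    timespec t{};
    clock_gettime(id, &t);
    return static_cast<int64_t>(t.tv_sec) * 1000000000LL + t.tv_nsec;
}

// Sample CLOCK_MONOTONIC (the clock v4l2 normally stamps buffers with)
// and CLOCK_MONOTONIC_RAW back-to-back. The difference shows how far the
// two timescales have drifted apart on this system; it grows over uptime
// because MONOTONIC is NTP-slewed while MONOTONIC_RAW is not.
static int64_t monotonicMinusRawNs() {
    return clockNs(CLOCK_MONOTONIC) - clockNs(CLOCK_MONOTONIC_RAW);
}
```

If that difference is large, a fixed per-frame correction will not hold over time, which matches the need for the kernel-side timestamp patches above.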