Nvarguscamerasrc Timestamp

Hi,
We are using nvarguscamerasrc (JetPack 4.5.1) to get the timestamp of our images (via a GStreamer pad probe). We have 6x IMX264 global shutter cameras (Leopard Imaging MIPI) running at 24.6 FPS.
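
For reference, here is a minimal, self-contained sketch of the kind of pad probe we use. The pipeline string and names are illustrative, not our exact code:

```c
/* Sketch: attach a buffer probe to nvarguscamerasrc and print each PTS.
 * Build with: gcc probe.c $(pkg-config --cflags --libs gstreamer-1.0) */
#include <gst/gst.h>

static GstPadProbeReturn
on_buffer (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
  /* PTS is in nanoseconds; for nvarguscamerasrc it derives from the
   * CLOCK_MONOTONIC-based capture time. */
  g_print ("frame pts: %" GST_TIME_FORMAT "\n",
           GST_TIME_ARGS (GST_BUFFER_PTS (buf)));
  return GST_PAD_PROBE_OK;
}

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GError *err = NULL;
  GstElement *pipeline =
      gst_parse_launch ("nvarguscamerasrc name=cam ! fakesink", &err);
  if (!pipeline) {
    g_printerr ("failed to build pipeline: %s\n", err->message);
    return 1;
  }

  GstElement *cam = gst_bin_get_by_name (GST_BIN (pipeline), "cam");
  GstPad *pad = gst_element_get_static_pad (cam, "src");
  gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BUFFER, on_buffer, NULL, NULL);
  gst_object_unref (pad);
  gst_object_unref (cam);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_usleep (3 * G_USEC_PER_SEC);   /* run briefly for demonstration */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
```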

As far as we understand, the timestamp is based on CLOCK_MONOTONIC, and we convert it to CLOCK_REALTIME.
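
Concretely, the conversion we apply looks roughly like this (a sketch; the helper names are illustrative). We sample both clocks back to back and add the offset to the monotonic capture timestamp:

```c
#include <stdint.h>
#include <time.h>

static int64_t
timespec_ns (struct timespec ts)
{
  return (int64_t) ts.tv_sec * 1000000000LL + ts.tv_nsec;
}

/* Convert a CLOCK_MONOTONIC timestamp (ns) to CLOCK_REALTIME (ns). */
int64_t
mono_to_realtime_ns (int64_t mono_ns)
{
  struct timespec mono, real;
  clock_gettime (CLOCK_MONOTONIC, &mono);
  clock_gettime (CLOCK_REALTIME, &real);
  return mono_ns + (timespec_ns (real) - timespec_ns (mono));
}
```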

We would like to understand exactly what that time represents, because we seem to be seeing a 40-60 ms delay between our measurement and what a synchronized IMU reports.

Is that the time at which the sensor starts sending the image? For a global shutter camera like the IMX264, would that be just past the end of the exposure?

Also, as we understand it, there is a three-image pipeline between the sensor and the moment an image arrives in Argus: image 1 is the image currently being received, image 2 is buffered, and image 3 is being processed in the ISP. Is all of this buffering accounted for in the image timestamp, or do we need to subtract three frames' worth of time?

In the end, the timestamp we want is the time at the middle of the exposure of the image.
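
If the timestamp does turn out to be SOF, and if, for a global-shutter sensor, readout begins right after the exposure ends, then the correction would simply be mid-exposure = SOF - exposure/2. That assumption is exactly what we are trying to confirm; as a sketch:

```c
#include <stdint.h>

/* Hypothetical correction, assuming the reported timestamp is SOF and
 * that readout starts immediately after the exposure ends. */
int64_t
mid_exposure_ns (int64_t sof_ns, int64_t exposure_ns)
{
  return sof_ns - exposure_ns / 2;
}
```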

Regards

Sebastian

The timestamp is the SOF (start of frame) time.

Hi,
Thank you for the quick reply. We don't seem to be seeing the actual start of frame in our timestamp. Instead we see a 1-2 frame delay (40-60 ms, at our frame period of 1/24.6 ≈ 40.7 ms).

I can see how, for a rolling shutter camera, the timestamp would match the start of the frame, since the camera starts sending data right away. In our case the camera is a global shutter camera, so I don't think it can send data before the exposure is complete.

How would libargus know when the start of the frame is?

Sebastian

The sensor sends an SOF packet, and the NVCSI/VI hardware receives it and generates the timestamp.
