We are using nvarguscamerasrc (JetPack 4.5.1) to get the timestamp of our images (via a GStreamer pad probe). We have 6x IMX264 global shutter cameras (Leopard Imaging MIPI) running at 24.6 FPS.
As far as we understand, the timestamp is based on CLOCK_MONOTONIC, and we convert it to CLOCK_REALTIME.
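For reference, this is roughly how we do the conversion: sample both clocks back-to-back, compute the offset, and apply it to the buffer timestamp. The function name is ours, not part of any API, and the back-to-back sampling assumes the two `clock_gettime` calls happen close enough together that the offset error is negligible.

```python
import time

def monotonic_to_realtime(monotonic_ns: int) -> int:
    """Convert a CLOCK_MONOTONIC timestamp (ns) to CLOCK_REALTIME (ns).

    Samples both clocks back-to-back and applies the measured offset.
    The two reads are not atomic, so there is a small (usually sub-
    microsecond) error from the time elapsed between them.
    """
    mono_now = time.clock_gettime_ns(time.CLOCK_MONOTONIC)
    real_now = time.clock_gettime_ns(time.CLOCK_REALTIME)
    offset = real_now - mono_now
    return monotonic_ns + offset
```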
We would like to understand exactly what that time represents, because we are seeing roughly a 40-60 ms delay between our measurement and what we see in a synchronized IMU.
Is that the time at which the sensor starts sending the image? For a global shutter sensor like the IMX264, would that be just past the end of the exposure?
Also, as we understand it, there is a three-image pipeline between the sensor and Argus: image 1 is currently being received from the sensor, image 2 is buffered, and image 3 is being processed in the ISP. Is all of this buffering already accounted for in the image timestamp, or do we need to subtract three frame periods ourselves?
In the end, the timestamp we want is the time at the middle of the exposure of each image.