When getting a timestamp from libargus with Argus::ICaptureMetadata::getSensorTimestamp or EGLStream::IFrame::getTime, what clock domain are the timestamps in? Can they be compared to Linux's CLOCK_MONOTONIC system timestamps, for instance? If they're in a different clock domain, how can I rectify these timestamps against wall time?
Assuming you're working with l4t-r32.2:
There is an offset_ns device node, which is used to calculate the capture timestamp from the system time. It is based on the formula below:
clock_gettime(MONOTONIC_RAW) = cycle_ns(TSC) - offset_ns
By the way, according to the Camera Architecture Stack documentation, there are different software blocks for accessing the camera sensors.
You may also refer to Topic 1057229 and apply the fix for the missing VI buffer timestamp.
Thanks.
Is there any way to access the TSC corresponding to the Argus timestamp? Otherwise, is there any way to access the cycle_ns frequency and the offset_ns so that we can convert back to TSC? Are these values constant once the Xavier boots up?
Where could we see the actual code for this? We are seeing about a 160 ms delay (about 6 frames) in the timestamp that is reported to us. Is there some internal buffering in Argus? Is there a way to set the buffer size?
You may download the L4T Multimedia API from the download center for the sources that access the camera sensor.
For example: $l4t-r32.2/public_sources/tegra_multimedia_api/argus/samples/yuvJpeg/*
You may use the function call below to get the sensor timestamp:
iCaptureMetadata->getSensorTimestamp()
This returns the kernel timestamp of the sensor start-of-frame; you do not need to add offset_ns for compensation.
Please also note that the sensor timestamp is in nanoseconds.
Thanks.
Can you confirm that Argus is using MONOTONIC_RAW instead of MONOTONIC? The Argus documentation PDF says all times are from the monotonic clock. In addition, when we ran tests, the Argus timestamp was around 200 ms ahead of monotonic raw but 100 ms behind monotonic, which also seems to suggest it is using the monotonic clock.
Argus uses the timestamp from the RTCPU, a dedicated processor for camera management.
You might also consult the Tegra X2 (Parker Series SoC) Technical Reference Manual; check [Figure 1: Parker Processor Block Diagram] for details.
Thanks.
Is there a way to query this clock and/or its offset relative to MONOTONIC and/or MONOTONIC_RAW from a C program? I need to convert the sensor timestamps into a common time domain as used by the other sensors.
I’m still a bit confused. I’m getting the timestamps using
iCaptureMetadata->getSensorTimestamp()
which you state above is kernel time and I do not need to add offset_ns to compensate. Is this timestamp from the MONOTONIC or MONOTONIC_RAW clock, and if it’s neither of them, what offset do I use to convert it?
The iCaptureMetadata->getSensorTimestamp() function call from the Argus API returns the kernel timestamp of the sensor start-of-frame.
You may also run the Argus sample, argus_yuvjpeg, to check the timestamp.
Please also note that the sensor timestamp is in nanoseconds.
Sorry for the late reply; I did not receive a system notification for the follow-up question in this discussion thread, which was already marked as solved. Also, this is an old thread that has been considered closed since Nov 2019.
Next time, I would suggest starting a new discussion thread for better support.
Argus::ICaptureMetadata::getSensorTimestamp captures the start-of-frame timestamp of the sensor signaling,
while EGLStream::IFrame::getTime captures the timestamp when the frame is rendered to the display; it is close to the end-of-frame of the sensor signaling.
FYI,
we have made some changes for Xavier to plumb the hardware timestamps up into the camera software stack.
Start-of-frame and end-of-frame sensor timestamps from the TSC hardware will be included in the metadata, via new interfaces for the TSC HW timestamp.
You may expect those changes to be included in the next public release, i.e. JetPack-4.5 / l4t-r32.5.
Thanks.