We’re running JetPack 4.6 with L4T 32.6.1, and we sometimes see getSensorTimestamp() return a time that is offset by a few seconds from CLOCK_REALTIME. Normally, getSensorTimestamp() is aligned with CLOCK_MONOTONIC. Is there a setting that could cause this? Below is an example of the timestamps when this happens:
sof    = 1648740421.0264   => iMetadata->getSensorTimestamp()
boot   = 58944.4308        => CLOCK_BOOTTIME
monor  = 58944.4276        => CLOCK_MONOTONIC_RAW
mono   = 58944.4308        => CLOCK_MONOTONIC
real   = 1648740424.8134   => CLOCK_REALTIME
exp    = 0.0080            => iMetadata->getSensorExposureTime()
softsc = 58951.631644      => iSensorTimestampTsc->getSensorSofTimestampTsc()
eoftsc = 58951.647736      => iSensorTimestampTsc->getSensorEofTimestampTsc()
offset = 7245379424        => /sys/devices/system/clocksource/clocksource0/offset_ns
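For context, here is a minimal sketch of how we collect the numbers above. It assumes iMetadata is an Argus::ICaptureMetadata* and iSensorTimestampTsc is an Argus::Ext::ISensorTimestampTsc*, both obtained via interface_cast<> on the completed CaptureMetadata (not shown), and that all of the metadata values are in nanoseconds:

```cpp
#include <Argus/Argus.h>
#include <Argus/Ext/SensorTimestampTsc.h>
#include <cstdio>
#include <ctime>

// Helper: read a POSIX clock and return it as seconds (double).
static double readClock(clockid_t id)
{
    struct timespec ts;
    clock_gettime(id, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

// Called once per completed capture to dump the timestamps we compare.
void dumpTimestamps(Argus::ICaptureMetadata *iMetadata,
                    Argus::Ext::ISensorTimestampTsc *iSensorTimestampTsc)
{
    // Metadata values are reported in nanoseconds; convert to seconds for printing.
    printf("sof    = %.4f\n",  iMetadata->getSensorTimestamp() / 1e9);
    printf("boot   = %.4f\n",  readClock(CLOCK_BOOTTIME));
    printf("monor  = %.4f\n",  readClock(CLOCK_MONOTONIC_RAW));
    printf("mono   = %.4f\n",  readClock(CLOCK_MONOTONIC));
    printf("real   = %.4f\n",  readClock(CLOCK_REALTIME));
    printf("exp    = %.4f\n",  iMetadata->getSensorExposureTime() / 1e9);
    printf("softsc = %.6f\n",  iSensorTimestampTsc->getSensorSofTimestampTsc() / 1e9);
    printf("eoftsc = %.6f\n",  iSensorTimestampTsc->getSensorEofTimestampTsc() / 1e9);
}
```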
The (tsc - offset_ns) = CLOCK_MONOTONIC_RAW relationship still looks correct. We could switch to using the TSC timestamps, but we would prefer to stick with getSensorTimestamp() if possible. Which kernel function is used to produce the timestamps reported by getSensorTimestamp()? Is it ktime_get_ts64()?
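For reference, this is roughly how we verify the (tsc - offset_ns) relationship. It is only a sketch: it assumes offset_ns and the value from getSensorSofTimestampTsc() are both in nanoseconds, and the reported delta includes frame latency because CLOCK_MONOTONIC_RAW is sampled after the capture has completed:

```cpp
#include <cstdint>
#include <cstdio>
#include <ctime>

// Read the TSC offset exposed by the kernel, in nanoseconds (-1 on failure).
static int64_t readTscOffsetNs()
{
    FILE *f = fopen("/sys/devices/system/clocksource/clocksource0/offset_ns", "r");
    if (!f)
        return -1;
    long long offset = -1;
    if (fscanf(f, "%lld", &offset) != 1)
        offset = -1;
    fclose(f);
    return (int64_t)offset;
}

// Compare (SOF TSC timestamp - offset_ns) against CLOCK_MONOTONIC_RAW.
// sofTscNs is the value returned by getSensorSofTimestampTsc().
void checkTscRelationship(uint64_t sofTscNs)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC_RAW, &ts);
    int64_t monoRawNs = (int64_t)ts.tv_sec * 1000000000LL + ts.tv_nsec;

    int64_t offsetNs = readTscOffsetNs();
    int64_t sofRawNs = (int64_t)sofTscNs - offsetNs;

    // delta should stay small (frame latency plus scheduling jitter);
    // a multi-second value would indicate the same drift we see with
    // getSensorTimestamp().
    int64_t deltaNs = monoRawNs - sofRawNs;
    printf("sof(raw) = %.6f  mono_raw = %.6f  delta = %.6f s\n",
           sofRawNs / 1e9, monoRawNs / 1e9, deltaNs / 1e9);
}
```

With the numbers above, 58951.631644 - 7.245379424 ≈ 58944.386, which is within ~40 ms of CLOCK_MONOTONIC_RAW, so the TSC path looks consistent.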