I am working on a project with a number of different sensors connected to a Jetson Orin AGX and I really want to have them all provide timestamps synced to the same time domain.
The issue is that the Tegra camera system (via Argus) seems to provide video frames with TSC monotonic timestamps, while all our other sensors (like lidars) provide timestamps synced via PTP to the Linux realtime clock domain.
So I am looking for a way to synchronize the timestamps I get from the cameras to the Linux realtime clock. Since I use JetPack 6.2, I have found this code snippet somewhere on this forum:
Are there any actual ways to get all my sensors synchronized into one clock domain, or is using this calculated offset (with more than 1 ms of jitter) the only option?
Thank you for your reply. Could you please give me more details about what you mean by "HW need sync design first"? Is it something like an external trigger for the lidars/cameras, etc.?
Thanks for your reply.
I have just noticed an interesting behavior I cannot explain:
I have two Orin AGX units with JetPack 6.2, and when I run the program I attached previously on one of them, the offset deviation is only about 1-2 us.
However, the same measurement on the second Orin gives me a deviation of >100 ms.
I could not find an issue or any difference between these Orins; could you please give me a clue what might be going on here?
And one more question after carefully re-reading this topic.
Did I get it right that I can now use Argus::ICaptureMetadata::getSensorTimestamp to get a CLOCK_MONOTONIC_RAW frame timestamp, without any additional conversion from TSC HW time to Linux system monotonic time?
Does Argus make this conversion by itself?
Customize nvarguscamerasrc by adding auto ts = iMetadata->getSensorTimestamp(); to StreamConsumer::threadExecute. So, as far as I understand, ts here should be a TSC hardware timestamp.
Get the current TSC HW timestamp by reading the cntfrq_el0 and cntvct_el0 registers, as shown in the referenced post.
Calculate the difference between the current timestamp and ts; I get a value of ~31 s.
If iMetadata->getSensorTimestamp() provided TSC HW timestamps, the difference between it and the current TSC HW time should be much smaller, about 20 ms I would guess.
However, when I run the same experiment but obtain the current timestamp via clock_gettime(CLOCK_MONOTONIC_RAW, &real), the difference is about 20 ms, which makes me think that getSensorTimestamp() actually returns Linux kernel CLOCK_MONOTONIC_RAW time.
Could you please clarify this for me? I use JetPack 6.2, btw.