We want to measure camera latency and analyze how much of that latency each processing stage contributes, using a Jetson TX2 with an OV5693 camera.
Previously, we measured glass-to-glass latency using the GStreamer nvarguscamerasrc element and got the following results:
- 1280x720 @ 30 fps: about 91 ms; 1280x720 @ 120 fps: about 183 ms (same resolution, different frame rates)
At 30 fps, 91 ms corresponds to about 2.7 frame periods (33.3 ms each). At 120 fps, however, the latency is not the same 2.7 frame periods (8.3 ms x 2.7 ≈ 23 ms) but about 22 frame periods (8.3 ms x 22 = 183 ms). We don't have much experience or intuition for analyzing these results, so we want to break them down by obtaining a precise timestamp at each processing stage on the Jetson TX2.
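For reference, the frame-period arithmetic above can be checked with a few lines (using the latency numbers quoted from our measurements):

```python
# Express each measured glass-to-glass latency in frame periods.
# latency_s * fps == latency_s / (1 / fps) == number of frame periods.
measurements = {
    30: 0.091,   # ~91 ms at 1280x720, 30 fps
    120: 0.183,  # ~183 ms at 1280x720, 120 fps
}

for fps, latency_s in measurements.items():
    frames = latency_s * fps
    print(f"{fps} fps: {latency_s * 1000:.0f} ms = {frames:.1f} frame periods")
# 30 fps: 91 ms = 2.7 frame periods
# 120 fps: 183 ms = 22.0 frame periods
```

So the absolute latency roughly doubles at 120 fps, while the latency counted in frames grows about eightfold, which is what we would like to explain stage by stage.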
We are using JetPack 4.5.1, L4T 32.5.1, and the OV5693 camera.
The detailed questions are as follows:
- How can we get a timestamp at each stage of the following flow: camera → CSI interface → buffer → ISP → memory in → memory out → HDMI interface → display?
- For this, I think timestamps need to be added in both the kernel and the application. Which application framework is easier to analyze for this purpose, e.g., the Multimedia API?
- And how can we monitor the timestamps of a specific "target frame", e.g., the N-th frame, in both the kernel and the application layer?
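To make the last question concrete, here is a minimal sketch of what we have in mind on the application side: tag one target frame by its frame number and record a monotonic timestamp at each stage it passes through. The function names (`log_stage`, `stage_deltas_ms`) are our own hypothetical helpers, and we are assuming the capture-side timestamps (e.g., from V4L2 buffers) are on the same monotonic clock as userspace `time.monotonic_ns()`, which we would need to verify on L4T 32.5.1:

```python
import time

def log_stage(frame_id, stage, timestamps, target_frame=None):
    """Record a monotonic timestamp for one processing stage of one frame.

    If target_frame is given, only that frame is recorded, which is the
    'N-th frame' monitoring described above.
    """
    if target_frame is None or frame_id == target_frame:
        timestamps.setdefault(frame_id, {})[stage] = time.monotonic_ns()

def stage_deltas_ms(stages):
    """Latency of each stage relative to the earliest recorded timestamp."""
    t0 = min(stages.values())
    return {name: (t - t0) / 1e6 for name, t in stages.items()}

# Hypothetical usage in a capture loop: replace the sleep with real work.
if __name__ == "__main__":
    ts = {}
    for frame_id in range(10):
        log_stage(frame_id, "dequeue", ts, target_frame=5)
        time.sleep(0.001)  # stand-in for ISP / memory / render stages
        log_stage(frame_id, "render", ts, target_frame=5)
    if 5 in ts:
        print(stage_deltas_ms(ts[5]))
```

The corresponding kernel-side timestamps would presumably come from trace points or added logging in the capture driver; part of our question is how to correlate those with the application-side records for the same frame number.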