Measuring Latency of a CSI Camera from Capture to RAM


I’ve been searching for some time but can’t quite find a good way to measure the latency of a CSI camera. Can anyone help out? I’m trying to get the latency between a frame being captured (i.e. the end of the exposure time) and the frame being available in RAM.

I’ve seen ways to measure glass-to-glass latency with monitors, but the pixel response time of monitors can vary widely, so that isn’t an ideal measurement method for me.

Any help would be much appreciated!

Edit: I found that ./argus_camera --kpi=1 will output a latency figure. What latency does it measure?

The argus --kpi output seems to show the fps instead of the latency.

hello alexisguiter,

here’s the camera capture pipeline: sensor → CSI → VI.
May I know your requirement for checking this latency, i.e. the point at which frames become available in RAM?

Ideally I want a latency of 30 ms or less. I want to test pipelines with GStreamer, as well as Libargus on its own (in C++).
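For the GStreamer side, one sketch (not from this thread, just a suggestion) is to use GStreamer’s built-in latency tracer, which logs per-buffer time from source to sink. Note it measures from the moment the source element pushes a buffer, not from end of exposure, so it under-reports the sensor-side portion. This assumes a Jetson platform where nvarguscamerasrc is the CSI camera source; adjust the caps to your sensor mode.

```shell
# Sketch: log pipeline latency with GStreamer's latency tracer.
# Assumptions: Jetson with nvarguscamerasrc available; GStreamer 1.x.
# Tracer output goes to stderr, so redirect it to a file.
GST_TRACERS="latency" GST_DEBUG="GST_TRACER:7" \
gst-launch-1.0 nvarguscamerasrc num-buffers=300 \
  ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1' \
  ! fakesink sync=false 2> latency.log

# Inspect the traced per-buffer latencies.
grep latency latency.log | head
```

Using fakesink keeps display rendering out of the measurement, which is closer to your “frame available in RAM” endpoint than a glass-to-glass test.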

I’m also testing monochrome cameras, which don’t need the ISP for debayering.

hello alexisguiter,

it depends on the sensor type; which monochrome sensor are you using?
I suggest opening another forum thread to track this.

hello alexisguiter,

there’s a sync-point mechanism for communication between the sensor hardware and the software that issues the capture request.
It uses start-of-frame signaling to indicate a frame. So, if your sensor frame rate is 30 fps, the ideal latency to program one frame would be 33 ms (i.e. 1/30 s) between frame N and frame N+1.

When you enable the --kpi option to check the latency, it evaluates the latency from start-of-frame to rendering on the display.
So you can expect the result to be greater than the frame-programming time.
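As a quick sanity check on the numbers above, the frame-programming time is just the sensor frame period, so any --kpi result measured from start-of-frame should come out larger than this:

```shell
# Frame period at 30 fps, in milliseconds: 1000 / 30 ≈ 33.3 ms.
# An end-to-end latency measured from start-of-frame must exceed this,
# since at least one full frame time of readout is always included.
awk 'BEGIN { printf "%.1f ms\n", 1000 / 30 }'
```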

Thank you for the explanation. To confirm, start-of-frame is the time when Libargus requests a frame, i.e. the start of exposure?

hello alexisguiter,

yes, that’s correct.