Is there any way to test the delay from the sensor to libargus?

I use libargus to get raw data from the sensor, convert it to NV12 in an NvBuffer, and output it to HDMI,
and I found that this costs more than 200 ms.
Is there any way to reduce the time?

Getting the data from libargus to HDMI costs less than 10 ms.
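
For reference, one way to check the sensor-to-application part of this delay is to compare the capture metadata's sensor timestamp against the current time when the frame is acquired. A minimal sketch, assuming an Argus consumer loop (as in the jetson_multimedia_api samples) that already hands you the CaptureMetadata of each acquired frame; printSensorToNowLatency is a hypothetical helper, and the timestamp is assumed to share a clock base with CLOCK_MONOTONIC, which may need an offset correction on some L4T releases:

```cpp
#include <Argus/Argus.h>
#include <cstdint>
#include <cstdio>
#include <time.h>

// Hypothetical helper: given the CaptureMetadata of a frame that was just
// acquired by the consumer, print how long ago the sensor started exposing it.
static void printSensorToNowLatency(Argus::CaptureMetadata *metadata)
{
    Argus::ICaptureMetadata *iMeta =
        Argus::interface_cast<Argus::ICaptureMetadata>(metadata);
    if (!iMeta)
        return;

    // Sensor (start-of-frame) timestamp, reported in nanoseconds.
    const uint64_t sensorNs = iMeta->getSensorTimestamp();

    // Assumption: the timestamp shares a clock base with CLOCK_MONOTONIC.
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    const uint64_t nowNs =
        (uint64_t)ts.tv_sec * 1000000000ULL + (uint64_t)ts.tv_nsec;

    printf("sensor -> application latency: %.1f ms\n",
           (double)(nowNs - sensorNs) / 1e6);
}
```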

What’s the frame rate? Did you run nvpmodel and jetson_clocks?
In our test it’s about 100 ms when we run a 60 fps sensor mode.

Hi,
I want to know what Argus does during that time; even 100 ms is too long for me.
Can turning off some operations, for example setting setDenoiseMode to OFF, reduce the cost?
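
For reference, a minimal sketch of turning off ISP denoise and edge enhancement on a capture request, using the Argus::IDenoiseSettings and Argus::IEdgeEnhanceSettings interfaces; disablePostProcessing is a hypothetical helper name, and whether this actually reduces the latency is exactly the question here:

```cpp
#include <Argus/Argus.h>

// Hypothetical helper: disable ISP denoise and edge enhancement on a request.
static bool disablePostProcessing(Argus::Request *request)
{
    Argus::IDenoiseSettings *iDenoise =
        Argus::interface_cast<Argus::IDenoiseSettings>(request);
    Argus::IEdgeEnhanceSettings *iEdge =
        Argus::interface_cast<Argus::IEdgeEnhanceSettings>(request);
    if (!iDenoise || !iEdge)
        return false;

    iDenoise->setDenoiseMode(Argus::DENOISE_MODE_OFF);
    iEdge->setEdgeEnhanceMode(Argus::EDGE_ENHANCE_MODE_OFF);
    return true;
}
```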

The camera algorithms need a few frames to process, so 80-100 ms is the best case.

What does Argus do during that time?
I need high real-time performance.

Argus goes through the ISP pipeline, runs the 3A algorithms, and does a lot of other work. You may try VI mode to capture the raw image and do the debayering in software.
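
A rough sketch of what a software debayer could look like, assuming an RGGB pattern and 10- or 12-bit raw already unpacked to one 16-bit word per pixel; the actual Bayer order, bit depth, and packing depend on the sensor:

```cpp
#include <cstdint>
#include <vector>

// Very simple 2x2 nearest-neighbor demosaic of an RGGB Bayer frame to 8-bit RGB.
static void debayerRGGB(const uint16_t *raw, int width, int height, int bits,
                        std::vector<uint8_t> &rgb)
{
    rgb.resize((size_t)width * height * 3);
    const int shift = bits - 8;                       // scale down to 8 bit

    for (int y = 0; y < height; y += 2) {
        for (int x = 0; x < width; x += 2) {
            uint8_t r  = raw[y * width + x]           >> shift;   // R
            uint8_t g1 = raw[y * width + x + 1]       >> shift;   // G
            uint8_t g2 = raw[(y + 1) * width + x]     >> shift;   // G
            uint8_t b  = raw[(y + 1) * width + x + 1] >> shift;   // B
            uint8_t g  = (uint8_t)(((int)g1 + (int)g2) / 2);

            // Write the same RGB value to all four pixels of the 2x2 cell.
            for (int dy = 0; dy < 2; ++dy) {
                for (int dx = 0; dx < 2; ++dx) {
                    size_t o = ((size_t)(y + dy) * width + (x + dx)) * 3;
                    rgb[o + 0] = r;
                    rgb[o + 1] = g;
                    rgb[o + 2] = b;
                }
            }
        }
    }
}
```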

How do I enable VI mode?

Use the V4L2 API for it, like the v4l2-ctl tool does.
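
A minimal sketch of the V4L2 mmap streaming path that v4l2-ctl uses internally, assuming /dev/video0, a 1920x1080 mode, and the SRGGB10 raw format (adjust these to the sensor). On Jetson you may also need to set the bypass_mode control to 0 (for example with v4l2-ctl) so frames come through VI rather than the ISP:

```cpp
#include <fcntl.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <linux/videodev2.h>
#include <cstdio>

int main()
{
    // Placeholder device node and sensor mode; change to match the setup.
    const char *dev = "/dev/video0";
    int fd = open(dev, O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    // Ask the driver for the sensor's raw Bayer format.
    v4l2_format fmt = {};
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width = 1920;
    fmt.fmt.pix.height = 1080;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_SRGGB10;
    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("VIDIOC_S_FMT"); return 1; }

    // Request a small ring of driver-allocated buffers and map them.
    v4l2_requestbuffers req = {};
    req.count = 4;
    req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;
    if (ioctl(fd, VIDIOC_REQBUFS, &req) < 0) { perror("VIDIOC_REQBUFS"); return 1; }
    if (req.count > 4) req.count = 4;

    void *mem[4] = {};
    size_t len[4] = {};
    for (unsigned i = 0; i < req.count; ++i) {
        v4l2_buffer buf = {};
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = i;
        if (ioctl(fd, VIDIOC_QUERYBUF, &buf) < 0) { perror("VIDIOC_QUERYBUF"); return 1; }
        len[i] = buf.length;
        mem[i] = mmap(NULL, buf.length, PROT_READ | PROT_WRITE, MAP_SHARED, fd, buf.m.offset);
        if (mem[i] == MAP_FAILED) { perror("mmap"); return 1; }
        if (ioctl(fd, VIDIOC_QBUF, &buf) < 0) { perror("VIDIOC_QBUF"); return 1; }
    }

    v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (ioctl(fd, VIDIOC_STREAMON, &type) < 0) { perror("VIDIOC_STREAMON"); return 1; }

    // Dequeue a few frames; each buffer holds one raw Bayer frame that can
    // be debayered in software before display.
    for (int n = 0; n < 10; ++n) {
        v4l2_buffer buf = {};
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        if (ioctl(fd, VIDIOC_DQBUF, &buf) < 0) { perror("VIDIOC_DQBUF"); break; }
        printf("frame %d: %u bytes\n", n, buf.bytesused);
        ioctl(fd, VIDIOC_QBUF, &buf);        // hand the buffer back to the driver
    }

    ioctl(fd, VIDIOC_STREAMOFF, &type);
    for (unsigned i = 0; i < req.count; ++i)
        if (mem[i]) munmap(mem[i], len[i]);
    close(fd);
    return 0;
}
```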

Thanks.