My Jetson Xavier NX uses the built-in imx219 camera with the camera_unit_sample routine to preview a live stream, and I see a delay of more than 100 ms. How can I reduce this delay?
Hello, is anybody there?
May I ask how to check ISP events on the NX?
Does anyone know the ISP's processing time?
hello CarmeloBryant,
may I have more details about this.
Did you mean the glass-to-glass latency? Also, please share the steps for how you evaluated the latency.
I used the unittest_samples/camera_unit_sample routine, and the desktop display shows a delay of more than 100 ms. So I want to confirm the ISP latency (ISP pipeline latency) and the CMOS latency (video capture latency).
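For reference, here is a minimal sketch of one way to measure the video-capture part of the latency, by comparing V4L2 buffer timestamps against the current monotonic clock. It assumes the device reports monotonic timestamps and that streaming has already been set up with the multi-planar capture type (adjust the buffer type if your pipeline uses the single-planar API); measure_capture_latency is a hypothetical helper, not part of camera_unit_sample.

```cpp
#include <cstdio>
#include <cstring>
#include <ctime>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

// Call after VIDIOC_STREAMON; 'fd' is the open video device (e.g. /dev/video0).
void measure_capture_latency(int fd)
{
    struct v4l2_plane planes[VIDEO_MAX_PLANES];
    struct v4l2_buffer buf;
    std::memset(planes, 0, sizeof(planes));
    std::memset(&buf, 0, sizeof(buf));
    buf.type     = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE; // multi-planar capture
    buf.memory   = V4L2_MEMORY_MMAP;
    buf.m.planes = planes;
    buf.length   = VIDEO_MAX_PLANES;

    if (ioctl(fd, VIDIOC_DQBUF, &buf) < 0) {
        std::perror("VIDIOC_DQBUF");
        return;
    }

    struct timespec now;
    clock_gettime(CLOCK_MONOTONIC, &now);

    // buf.timestamp is taken when the frame finished capture; the difference
    // to "now" approximates the capture-to-application latency.
    double captured = buf.timestamp.tv_sec + buf.timestamp.tv_usec / 1e6;
    double dequeued = now.tv_sec + now.tv_nsec / 1e9;
    std::printf("frame %u: capture-to-dequeue latency ~ %.1f ms\n",
                buf.sequence, (dequeued - captured) * 1e3);

    // Re-queue the buffer so streaming continues.
    if (ioctl(fd, VIDIOC_QBUF, &buf) < 0)
        std::perror("VIDIOC_QBUF");
}
```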
hello CarmeloBryant,
that’s a normal result for glass-to-glass latency.
There’s an Argus FIFO:
on the Argus side, when the ISP is in use, our user-space driver internally initiates 2 extra captures for sensor exposure programming when Argus (and hence the underlying driver) receives the first capture request from the client.
These 2 internal captures are dropped at the driver level and are not sent to Argus or the client, so the client receives exactly the output captures it requested.
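To make the mechanism concrete, here is a conceptual sketch of the drop-the-settle-frames pattern described above, as a toy application-level loop rather than the actual driver code; Frame, capture_frame, and next_client_frame are hypothetical names used for illustration only.

```cpp
#include <cstdio>

struct Frame {
    unsigned sequence;
    bool exposure_settled; // hypothetical flag, for illustration only
};

static unsigned g_sequence = 0;

// Stand-in for whatever dequeues the next frame from the sensor.
Frame capture_frame()
{
    // Exposure programming takes effect from the 3rd frame onward.
    return Frame{g_sequence++, g_sequence > 2};
}

constexpr unsigned kSettleFrames = 2; // dropped internally by the driver

// Returns the first frame with the requested exposure applied.
Frame next_client_frame()
{
    for (;;) {
        Frame f = capture_frame();
        if (f.sequence < kSettleFrames) {
            // Exposure may not have settled yet; never forwarded to the client.
            std::printf("dropping settle frame %u\n", f.sequence);
            continue;
        }
        return f;
    }
}

int main()
{
    Frame f = next_client_frame();
    std::printf("client receives frame %u (exposure settled: %s)\n",
                f.sequence, f.exposure_settled ? "yes" : "no");
    return 0;
}
```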
- Does the driver initiate two additional captures for each capture request from the application, or only for the application's first capture request?
- If these two additional captures are never sent to the client, why add them at all?
- Is the 100 ms delay due to these two additional captures, or to the ISP pipeline latency plus the video capture latency plus the time spent on these two additional captures?
hello CarmeloBryant,
as mentioned, these 2 extra captures are for sensor exposure programming.
Although the sensor will have captured 3 frames, the first 2 frames may have incorrect exposure settings, so they are dropped.
FYI, there is also latency on the ISP side: roughly ~4-13 ms depending on the ISP clocks.
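Since the ISP latency scales with the clock, one way to inspect the current ISP clock rate is to read it from debugfs. This is a sketch only, not a documented interface: the path below is an assumption (it varies by platform and L4T release), and reading it requires root.

```cpp
#include <cstdio>
#include <fstream>
#include <string>

int main()
{
    // Hypothetical path; adjust for your platform / L4T release.
    const std::string path = "/sys/kernel/debug/bpmp/debug/clk/isp/rate";
    std::ifstream f(path);
    unsigned long long rate_hz = 0;
    if (f >> rate_hz)
        std::printf("ISP clock: %.1f MHz\n", rate_hz / 1e6);
    else
        std::fprintf(stderr, "could not read %s (needs root; path may differ)\n",
                     path.c_str());
    return 0;
}
```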
Then, excluding the ISP time and the acquisition program time, there is about 50 ms of delay left, which would be the CMOS sensor delay plus the delay from discarding the two frames.
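For a rough sanity check of that figure (a sketch only, assuming a hypothetical 30 fps sensor mode and a 60 Hz display, since neither is stated in this thread):

- sensor exposure + readout ≈ one frame period = 1000 / 30 ≈ 33.3 ms
- ISP pipeline ≈ 4-13 ms (per the reply above)
- display ≈ 1-2 vsync intervals at 60 Hz ≈ 16.7-33.3 ms
- total ≈ 54-80 ms glass-to-glass in steady state

Note that the two discarded frames delay only the first delivered frame (about 2 × 33.3 ms at 30 fps); once streaming, they do not add to the per-frame glass-to-glass latency.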
The sensor format set in the unittest_samples/camera_unit_sample routine is NV12M, but when I save the video with the routine's -o option and view it with a YUV tool, it does not appear to be in NV12 format.
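For context, V4L2_PIX_FMT_NV12M is the multi-planar variant of NV12: the Y plane and the interleaved UV plane live in two separate buffers, each possibly padded to a stride wider than the image, so a raw dump of the mapped buffers will not match what a packed-NV12 viewer expects. Below is a minimal sketch of repacking one frame; the plane pointers and strides are assumed to come from the mmap'ed MPLANE buffers, and write_nv12_frame is a hypothetical helper, not part of the sample.

```cpp
#include <cstdio>
#include <cstdint>

// Write one NV12M frame as packed NV12 (Y plane, then interleaved UV plane,
// with the driver's stride padding stripped).
// y/uv: mapped plane pointers; *_stride: bytes per line reported by the driver.
void write_nv12_frame(FILE *out,
                      const uint8_t *y,  unsigned y_stride,
                      const uint8_t *uv, unsigned uv_stride,
                      unsigned width, unsigned height)
{
    for (unsigned row = 0; row < height; ++row)       // Y: width x height
        std::fwrite(y + row * y_stride, 1, width, out);
    for (unsigned row = 0; row < height / 2; ++row)   // UV: width x height/2
        std::fwrite(uv + row * uv_stride, 1, width, out);
}
```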
hello CarmeloBryant,
please open a new topic, since it looks like you're now asking a different question rather than about capture latency.
How do we change this so that the application doesn't lose the first two frames?