I am bringing up a stereo visual-inertial odometry system, and the CSI camera sensors are externally triggered. However, I observe a delay/latency of more than 30 ms between applying the trigger signal and the frames becoming available. This is contrary to my initial expectation, since Libargus appears to be a low-level API.
The cameras are global shutter (Sony IMX296), the trigger pulse is very short (200 µs) for a short exposure, and the trigger frequency is 20 Hz.
Thanks for your reply. However, we do not stream the video out; we process the frames (fusing them with IMU and other data) on the Jetson itself. As for timestamps, we would like the frames to be timestamped as close to the external trigger signal as possible. Is there any way to reduce the latency, please?
If latency is critical, I would suggest using the V4L2 API instead of Argus, because the Argus pipeline has many stages that add latency.
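A minimal sketch of what the V4L2 path could look like. Assumptions to verify against your setup (e.g. with v4l2-ctl --list-formats-ext): the IMX296 is exposed at /dev/video0 and delivers unpacked 10-bit mono (V4L2_PIX_FMT_Y10) at its full 1456x1088 resolution. The kernel stamps each buffer at start-of-frame reception, which is much closer to the trigger edge than any userspace clock read:

```cpp
// Sketch: low-latency V4L2 capture loop with kernel timestamps.
// Device node, pixel format, and resolution are assumptions.
#include <fcntl.h>
#include <unistd.h>
#include <cstdio>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <linux/videodev2.h>

int main() {
    int fd = open("/dev/video0", O_RDWR);            // camera node (assumption)
    if (fd < 0) { perror("open"); return 1; }

    v4l2_format fmt{};
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width = 1456;                        // IMX296 full width
    fmt.fmt.pix.height = 1088;                       // IMX296 full height
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_Y10;      // 10-bit mono (assumption)
    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("S_FMT"); return 1; }

    v4l2_requestbuffers req{};
    req.count = 4;                                   // short queue keeps latency low
    req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;
    if (ioctl(fd, VIDIOC_REQBUFS, &req) < 0) { perror("REQBUFS"); return 1; }

    void* bufs[4];
    for (unsigned i = 0; i < req.count; ++i) {
        v4l2_buffer buf{};
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = i;
        if (ioctl(fd, VIDIOC_QUERYBUF, &buf) < 0) { perror("QUERYBUF"); return 1; }
        bufs[i] = mmap(nullptr, buf.length, PROT_READ | PROT_WRITE,
                       MAP_SHARED, fd, buf.m.offset);
        ioctl(fd, VIDIOC_QBUF, &buf);                // queue buffer for capture
    }

    int type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    if (ioctl(fd, VIDIOC_STREAMON, &type) < 0) { perror("STREAMON"); return 1; }

    for (int n = 0; n < 100; ++n) {
        v4l2_buffer buf{};
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        ioctl(fd, VIDIOC_DQBUF, &buf);               // blocks until a frame lands
        // buf.timestamp is set by the kernel at frame reception; compare it
        // against the trigger time to measure the real capture latency.
        printf("frame %u at %ld.%06ld\n", buf.sequence,
               buf.timestamp.tv_sec, buf.timestamp.tv_usec);
        // ... hand bufs[buf.index] to the VIO pipeline here ...
        ioctl(fd, VIDIOC_QBUF, &buf);                // requeue for reuse
    }

    ioctl(fd, VIDIOC_STREAMOFF, &type);
    close(fd);
    return 0;
}
```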
Should I start from one of the following examples: /usr/src/jetson_multimedia_api/samples/12_v4l2_camera_cuda/ or /usr/src/jetson_multimedia_api/samples/18_v4l2_camera_cuda_rgb/? Also, should I change the camera node in the device tree from
I do not need YUV output, only RGB/RGBA and possibly grayscale, since the sensor is monochrome. Is it possible to do this with the GPU (CUDA) only and bypass the ISP (Libargus)?
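For reference, a rough sketch of what I have in mind on the CUDA side. Since the sensor is monochrome there is no debayering to do, so the conversion is just a bit-depth reduction. This assumes the V4L2 driver delivers unpacked 16-bit-per-pixel Y10 data; the buffer names and the host-to-device copy are placeholders (zero-copy via cudaHostRegister on the mmap'ed V4L2 buffer could avoid the memcpy, but is omitted for brevity):

```cpp
// Sketch: expand unpacked 10-bit mono pixels to 8-bit grayscale on the GPU,
// bypassing the ISP entirely. Buffer handling is simplified for illustration.
#include <cuda_runtime.h>
#include <cstdint>

__global__ void y10_to_gray8(const uint16_t* __restrict__ in,
                             uint8_t* __restrict__ out,
                             int width, int height) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;
    int idx = y * width + x;
    out[idx] = static_cast<uint8_t>(in[idx] >> 2);   // drop 2 LSBs: 10-bit -> 8-bit
}

// Host-side launch sketch: copy the DQBUF'd frame to the device, convert,
// and keep the grayscale result on the GPU for the VIO front end.
// dev_gray is assumed to be pre-allocated with width*height bytes.
void convert_frame(const uint16_t* host_raw, uint8_t* dev_gray,
                   int width, int height) {
    uint16_t* dev_raw = nullptr;
    size_t raw_bytes = size_t(width) * height * sizeof(uint16_t);
    cudaMalloc(reinterpret_cast<void**>(&dev_raw), raw_bytes);
    cudaMemcpy(dev_raw, host_raw, raw_bytes, cudaMemcpyHostToDevice);

    dim3 block(16, 16);
    dim3 grid((width + 15) / 16, (height + 15) / 16);
    y10_to_gray8<<<grid, block>>>(dev_raw, dev_gray, width, height);
    cudaDeviceSynchronize();
    cudaFree(dev_raw);
}
```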