Synchronizing frame capture with IMU sensors

I am using a Jetson Xavier NX to capture frames from a USB camera with OpenCV, and I run a KCF tracker on the captured frames to track a certain object. In parallel, I send the tracking data to an ESP32 that captures IMU data from an MPU6050. I see a lot of inconsistency between when the tracker records movement and when the sensor records movement: sometimes the difference is 70 ms, and sometimes it gets up to 150 ms. That much difference is really messing up my plans; even the 70 ms difference seems too much, considering that with a 60 fps camera it amounts to almost 5 frames of delay. I checked the communication delay with a scope: it is a consistent ~3 ms over the UART link.

Here's how I am grabbing frames:

video.open(
    "v4l2src device=/dev/video0 queue-size=1 "
    "! video/x-raw,width=640,height=480 "
    "! nvvidconv flip-method=2 "
    "! video/x-raw(memory:NVMM),format=I420 "
    "! nvvidconv "
    "! video/x-raw,format=BGRx,width=(int)640,height=(int)480,framerate=(fraction)60/1 "
    "! videoconvert "
    "! video/x-raw,format=BGR "
    "! appsink drop=true sync=false",
    cv::CAP_GSTREAMER
);

I use cv::VideoCapture::read() to capture frames.
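For reference, a minimal sketch of the capture loop (the pipeline here is a simplified stand-in for the one above, and the caps are assumptions about what the camera supports). The key point is that a timestamp taken after read() returns only measures when the application got the frame; sensor exposure, the USB transfer, and the GStreamer conversion/queueing stages all happen before that and are invisible to it:

    #include <opencv2/opencv.hpp>
    #include <chrono>
    #include <cstdio>

    int main() {
        cv::VideoCapture video(
            "v4l2src device=/dev/video0 "
            "! video/x-raw,width=640,height=480,framerate=60/1 "
            "! videoconvert ! video/x-raw,format=BGR "
            "! appsink max-buffers=1 drop=true sync=false",
            cv::CAP_GSTREAMER);
        if (!video.isOpened()) return 1;

        cv::Mat frame;
        while (video.read(frame)) {
            // Host-side receive time, not the exposure time.
            auto now = std::chrono::steady_clock::now().time_since_epoch();
            double ms = std::chrono::duration<double, std::milli>(now).count();
            std::printf("frame received at %.3f ms (steady clock)\n", ms);
            // ... run the KCF tracker on `frame` here ...
        }
        return 0;
    }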

The IMU sensor runs at 200 Hz.

Can anyone explain both the delay between the sensor and the tracker measurements, and why that delay varies so much, ranging from 70 to 150 milliseconds?

I'll add a second question as well: how do I check the camera timestamp? I know the V4L2 driver offers a hardware timestamp via mmap, but how do I access it from C++/OpenCV code? I would appreciate any suggestion, even changing the way I capture frames.

Hello Elad-SH,

since you're using a USB camera, frames go through the VI driver and are processed by VB2 (videobuf2).
There will be vb->vb2_buf.timestamp recording the start-of-frame timestamp.
Could you please analyze that timestamp to check when the tracker/sensor recorded movement?
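For illustration, a rough sketch of that comparison, assuming the frame timestamp comes from the V4L2 buffer on the Jetson's CLOCK_MONOTONIC timebase and the IMU sample is stamped on arrival over UART, corrected by the ~3 ms link delay measured with the scope (the helper names are hypothetical):

    #include <ctime>
    #include <sys/time.h>

    // Both timestamps expressed in seconds on CLOCK_MONOTONIC.
    double frame_ts_sec(const timeval &tv) {   // from v4l2_buffer.timestamp
        return tv.tv_sec + tv.tv_usec / 1e6;
    }

    double imu_ts_sec() {                      // stamped when the UART packet lands
        timespec ts{};
        clock_gettime(CLOCK_MONOTONIC, &ts);
        const double uart_delay = 0.003;       // measured ~3 ms link delay
        return ts.tv_sec + ts.tv_nsec / 1e9 - uart_delay;
    }

The per-event offset is then frame_ts_sec(...) - imu_ts_sec() for the same physical movement; logging it over time shows whether the 70-150 ms gap is a fixed pipeline latency or jitter.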

BTW, for the synchronization use case, you may also apply a system-level configuration change to raise the priority of the running processes.
For example: $ renice [-n] <priority> [-p|--pid] <pid>
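If renice alone isn't enough, the same idea can be done from inside the program with real-time scheduling; a sketch, assuming the process runs as root or has CAP_SYS_NICE:

    #include <sched.h>
    #include <cstdio>

    // Request SCHED_FIFO for the calling process. Use with care: a busy
    // real-time thread can starve the rest of the system.
    bool raise_priority(int prio = 10) {       // valid FIFO range is 1..99
        sched_param sp{};
        sp.sched_priority = prio;
        if (sched_setscheduler(0 /* this process */, SCHED_FIFO, &sp) != 0) {
            std::perror("sched_setscheduler");
            return false;
        }
        return true;
    }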

While the USB camera does use the V4L2 driver, I am grabbing frames through OpenCV, and OpenCV doesn't expose clear access to the buffer timestamps. I am not sure I want to change the OpenCV code, or switch frame grabbing to raw V4L2, but I am not sure I have a choice; either way, I don't have direct access to the driver without making major changes.
I will try to raise the priority of the program and check whether it helps, but I am running at about 80-90% CPU at the moment, so I am not sure a priority change can help much.

Hello Elad-SH,

that's the v4l2_buffer struct for video buffer info; you may refer to the struct and modify the driver code to access the frame timestamp.
For example: $public_sources/source/public/kernel_src/kernel/kernel-5.10/include/uapi/linux/videodev2.h
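For what it's worth, the timestamp is also reachable from userspace without modifying the driver: VIDIOC_DQBUF fills v4l2_buffer.timestamp for every dequeued buffer. A condensed mmap-streaming sketch (error handling is mostly omitted; the device path and pixel format are assumptions):

    #include <fcntl.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <unistd.h>
    #include <linux/videodev2.h>
    #include <cstdio>

    int main() {
        int fd = open("/dev/video0", O_RDWR);
        if (fd < 0) { perror("open"); return 1; }

        // Negotiate 640x480 YUYV (match whatever the camera actually offers).
        v4l2_format fmt{};
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        fmt.fmt.pix.width = 640;
        fmt.fmt.pix.height = 480;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
        fmt.fmt.pix.field = V4L2_FIELD_NONE;
        ioctl(fd, VIDIOC_S_FMT, &fmt);

        // Request four mmap'ed kernel buffers and queue them all.
        v4l2_requestbuffers req{};
        req.count = 4;
        req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        req.memory = V4L2_MEMORY_MMAP;
        ioctl(fd, VIDIOC_REQBUFS, &req);

        void *mem[4] = {};
        for (unsigned i = 0; i < req.count && i < 4; ++i) {
            v4l2_buffer buf{};
            buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            buf.memory = V4L2_MEMORY_MMAP;
            buf.index = i;
            ioctl(fd, VIDIOC_QUERYBUF, &buf);
            mem[i] = mmap(nullptr, buf.length, PROT_READ, MAP_SHARED, fd, buf.m.offset);
            ioctl(fd, VIDIOC_QBUF, &buf);
        }

        int type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        ioctl(fd, VIDIOC_STREAMON, &type);

        for (int n = 0; n < 100; ++n) {
            v4l2_buffer buf{};
            buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            buf.memory = V4L2_MEMORY_MMAP;
            ioctl(fd, VIDIOC_DQBUF, &buf);   // blocks until a frame is ready

            // Driver-side timestamp: start-of-frame on typical UVC cameras,
            // CLOCK_MONOTONIC when V4L2_BUF_FLAG_TIMESTAMP_MONOTONIC is set.
            printf("frame %u ts=%ld.%06ld flags=0x%x\n", buf.index,
                   (long)buf.timestamp.tv_sec, (long)buf.timestamp.tv_usec, buf.flags);

            // mem[buf.index] holds the raw pixel data (wrap in cv::Mat if needed).
            ioctl(fd, VIDIOC_QBUF, &buf);    // hand the buffer back to the driver
        }

        ioctl(fd, VIDIOC_STREAMOFF, &type);
        close(fd);
        return 0;
    }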

Hello Jerry, thank you for the responses. It seems the problem was with the way I was capturing the frames.
cv::VideoCapture(…, CAP_GSTREAMER) and cv::VideoCapture(…, CAP_V4L2) behave very differently, both in how the camera initializes and in the modules OpenCV uses on it. Calling get(cv::CAP_PROP_POS_MSEC) when capturing with GStreamer versus capturing with V4L2 gives very different results.
With V4L2 capture it returns the Jetson's monotonic clock value for when the frame was taken from the camera, while with GStreamer it returns a software timestamp of when the program received the frame, and that timer starts at the first capture rather than being relative to any system clock.

Now I just need to figure out how to initialize the camera using CAP_V4L2 with the proper settings and run with it.
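In case it helps anyone landing here later, a minimal sketch of what that initialization could look like (the property values are assumptions about the camera; check the return value of set(), since not every property is honored by every device):

    #include <opencv2/opencv.hpp>
    #include <cstdio>

    int main() {
        cv::VideoCapture video(0, cv::CAP_V4L2);   // /dev/video0 via the V4L2 backend
        if (!video.isOpened()) return 1;

        video.set(cv::CAP_PROP_FOURCC, cv::VideoWriter::fourcc('Y', 'U', 'Y', 'V'));
        video.set(cv::CAP_PROP_FRAME_WIDTH, 640);
        video.set(cv::CAP_PROP_FRAME_HEIGHT, 480);
        video.set(cv::CAP_PROP_FPS, 60);
        video.set(cv::CAP_PROP_BUFFERSIZE, 1);     // keep the driver queue short

        cv::Mat frame;
        while (video.read(frame)) {
            // With CAP_V4L2 this comes from the buffer timestamp on the
            // monotonic clock, not from a software receive time.
            double ts_ms = video.get(cv::CAP_PROP_POS_MSEC);
            std::printf("frame ts = %.3f ms\n", ts_ms);
        }
        return 0;
    }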

Hello Elad-SH,

please see also the Camera Architecture Stack documentation; libargus and v4l2src use different pipelines.
