Synchronizing IMU and Camera via a start_frame or trigger

Hi, I am working on a robot. I plan to have an MCU read an IMU and some wheel encoders, and it’s important to be able to sync that data with the start of a camera frame. Basically, I either need to slave the camera trigger to the MCU, or the MCU needs to be able to register when a frame starts being recorded.

There are a number of color cameras for the Pi that offer this type of functionality, for example the Arducam adjustable/interchangeable M12 lens modules for the Raspberry Pi camera (Raspberry Pi 4/3/3 B+) and the Raspberry Pi High Quality Camera.

Both have official or unofficial Jetson Nano drivers, and these camera modules also provide additional input/output pins (VSYNC/XVS) that should be usable to indicate the start of a capture frame.

Do the existing Nano drivers allow you to program these modules to either 1) use an external trigger, or 2) indicate the start of a frame grab?

Synchronization is a huge problem when using these systems.

hello lowellm,

you should refer to the VI driver. It uses sync-points to communicate with the camera hardware and to program the buffers for the sensor frames; in the snippet below, the capture path waits on a frame-start sync-point for each active port before handing the buffer on.
for example,
$L4T_Sources/r32.6.1/Linux_for_Tegra/source/public/kernel/nvidia/drivers/media/platform/tegra/camera/vi/vi2_fops.c

        chan->capture_state = CAPTURE_GOOD;
        for (index = 0; index < valid_ports; index++) {
                err = nvhost_syncpt_wait_timeout_ext(chan->vi->ndev,
                        chan->syncpt[index][0], thresh[index],
                        chan->timeout, NULL, &ts);

note,
there is a ring buffer on the VI driver side; it releases buffer N at the N+2 frame-start event.
for example,
$L4T_Sources/r32.6.1/Linux_for_Tegra/source/public/kernel/nvidia/drivers/media/platform/tegra/camera/vi/channel.c

void tegra_channel_ring_buffer(
...
        /* release buffer N at N+2 frame start event */
        if (chan->num_buffers >= (chan->capture_queue_depth - 1))
                free_ring_buffers(chan, 1);

hence,
you may add some implementation on the VI driver side to send a signal at the frame-start event to synchronize with the IMU.
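
To make that concrete, here is a minimal sketch of what such an addition could look like; it is an assumption-laden illustration, not part of the stock driver. It assumes a small helper called from the VI capture path right after the frame-start sync-point wait shown above, which pulses a GPIO the MCU watches and records a kernel timestamp. The helper name and the GPIO (which you would have to wire up yourself and fetch, e.g. via devm_gpiod_get() from a device-tree entry you add) are placeholders.

/* hypothetical helper, not in the stock VI driver */
#include <linux/gpio/consumer.h>
#include <linux/ktime.h>

static void frame_start_notify_mcu(struct gpio_desc *sof_gpio, u64 *sof_ns)
{
        /* take the timestamp first, as close to the SOF event as possible;
         * use the same clock as the rest of your pipeline's timestamps */
        *sof_ns = ktime_to_ns(ktime_get_boottime());

        /* short pulse the MCU can latch; use gpiod_set_value_cansleep()
         * instead if the GPIO controller may sleep */
        gpiod_set_value(sof_gpio, 1);
        gpiod_set_value(sof_gpio, 0);
}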

Could you provide more information? Do the camera interface pins support generating a timing pulse or trigger?

If this is only doable through software, what is the best way to time the start of the frame and pull in UART or I2C data immediately?

no, it’s not supported.

the approach would be to allocate a buffer to store the IMU data with timestamps, and then gather the camera sensor frame timestamps to synchronize against them.
please also refer to Topic 159220; it shows an example of getting the SOF timestamp within the VI driver.
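
As a rough illustration of that approach (the names, sizes, and struct layout here are made up, not from any NVIDIA API): keep the timestamped IMU samples in a ring buffer, and for each frame’s SOF timestamp pick the closest buffered sample. Both timestamps must of course come from the same clock.

#include <stdint.h>
#include <stddef.h>

#define IMU_RING_SIZE 1024

struct imu_sample {
        int64_t ts_ns;        /* kernel timestamp taken when the sample was read */
        float   gyro[3];
        float   accel[3];
};

static struct imu_sample ring[IMU_RING_SIZE];
static size_t head;

void imu_push(const struct imu_sample *s)
{
        ring[head++ % IMU_RING_SIZE] = *s;
}

/* return the buffered sample closest in time to the frame SOF timestamp */
const struct imu_sample *imu_match_frame(int64_t frame_sof_ns)
{
        const struct imu_sample *best = NULL;
        int64_t best_delta = INT64_MAX;

        for (size_t i = 0; i < IMU_RING_SIZE; i++) {
                int64_t d = ring[i].ts_ns - frame_sof_ns;

                if (d < 0)
                        d = -d;
                if (ring[i].ts_ns != 0 && d < best_delta) {
                        best_delta = d;
                        best = &ring[i];
                }
        }
        return best;
}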

Hmm, will the Jetson drivers ever support this functionality? Hardware-wise it’s doable, since it’s an option on the Pi; it’s just a matter of driver development (which I have no knowledge of).
Are there any global-shutter cameras with an external trigger that run on the Jetson?

How do you timestamp an I2C or UART read?

hello lowellm,

you’ll need an implementation that receives the IMU data on the kernel side; then you’ll have a kernel timestamp with each sample.
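
For example, a rough kernel-side sketch (the driver struct, register address, and interrupt wiring are placeholders and assume an IMU with a data-ready interrupt line): take the timestamp in the threaded interrupt handler, then read the sample over I2C, so each sample carries a kernel timestamp; just make sure it uses the same clock as the camera frame timestamps you collect.

#include <linux/i2c.h>
#include <linux/interrupt.h>
#include <linux/ktime.h>

#define IMU_REG_DATA 0x3B        /* placeholder burst-read register */

struct my_imu {
        struct i2c_client *client;
        u64 last_ts_ns;
        u8 raw[14];
};

/* registered with request_threaded_irq() on the IMU data-ready line */
static irqreturn_t my_imu_irq_thread(int irq, void *dev_id)
{
        struct my_imu *imu = dev_id;

        /* timestamp first, as close to the data-ready edge as possible */
        imu->last_ts_ns = ktime_to_ns(ktime_get_boottime());

        /* threaded IRQ context, so a sleeping I2C transfer is fine */
        i2c_smbus_read_i2c_block_data(imu->client, IMU_REG_DATA,
                                      sizeof(imu->raw), imu->raw);
        return IRQ_HANDLED;
}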

you may contact Jetson Camera Partners for the available camera sensors.