How to acquire images from stereo cameras simultaneously?

Usage description:

We use two cameras with synchronized exposure as a stereo camera, as shown in the figure below.

We need:

  • Acquiring frames simultaneously, with strict time synchronization.
  • Acquiring frames as quickly as possible, with minimal delay.

Problems

We have tried two different methods, but both encountered problems:

  • Method 1: Acquire the left and right cameras using separate threads.

    • There is a 2–4 ms delay between the left and right frames acquired, which increases the total acquisition delay.

    • After each frame is acquired, there are 4700+ thread context switches.

    • More threads are created by IFrameConsumer::acquireFrame().

      • With two acquiring threads, 13 threads are created:
        nvidia@localhost:~$ ps -e -T |grep mtcam0
        7867  7921 ?        00:04:43 mtcam0
        7867  8100 ?        00:00:00 mtcam0
        7867  8164 ?        00:00:00 mtcam0
        7867  8200 ?        00:00:05 mtcam0
        7867  8228 ?        00:00:00 mtcam0
        7867  8470 ?        00:01:46 mtcam0
        7867  8471 ?        00:01:42 mtcam0
        7867  8472 ?        00:01:41 mtcam0
        7867  8473 ?        00:01:41 mtcam0
        7867  8474 ?        00:01:42 mtcam0
        7867  8480 ?        00:00:00 mtcam0
        nvidia@localhost:~$ ps -e -T |grep mtcam1
        7867  8318 ?        00:00:48 mtcam1
        7867  8334 ?        00:00:03 mtcam1
        7867  8362 ?        00:00:00 mtcam1
        7867  8521 ?        00:00:00 mtcam1
        
      • With a single acquiring thread, only 7 threads are created:
        ps -e -T |grep mtcam
        7788  7830 ?        00:11:48 mtcam
        7788  7972 ?        00:00:00 mtcam
        7788  7987 ?        00:00:00 mtcam
        7788  8008 ?        00:00:27 mtcam
        7788  8060 ?        00:00:02 mtcam
        7788  8066 ?        00:00:01 mtcam
        7788  9103 ?        00:00:06 mtcam
        7788  9105 ?        00:00:00 mtcam
        
  • Method 2: Acquire both cameras in a single thread.

We cannot find a suitable API to ensure synchronization of the acquired images.
For example, the left camera may have 1 frame queued while the right camera has 2 frames (when the left camera dropped a frame).

For reference, see this topic:

Topic 1070823

As explained by JerryChang in that topic, there are two use cases:

  • [case 1] multiple sensors per multi sessions.

  • [case 2] multiple sensors per single session.

Our application uses case 2, just like the syncSensor sample.

But instead of calling acquireFrame() sequentially in a single thread, we call acquireFrame() from two separate threads.

Does this modification have any impact?

Thanks.

hello superlvjf,

please use the single-session approach if you’re looking for precise timestamps for dual cameras.
you should have a hardware pin connected between these two camera modules (i.e. HW sync), and you should use the getSensorTimestamp() API to gather the sensor hardware timestamps; please implement a timestamp comparison in user space to achieve synchronized results.

besides syncSensor, there’s also a syncStereo sample that analyzes frames from two synchronized sensors.
thanks

We have rewritten the code according to the multiple-sensors-per-single-session use case.

But occasionally the program blocks on acquireFrame() and cannot obtain any frames, and there is no error message in dmesg or the libnvargus logs.

Please advise how we can investigate the cause. For example, how can we turn on detailed logging for nvargus-daemon?