syncSensor with four CSI-2 cameras on L4T 32.7.2

Hi,

I’m trying to synchronize four CSI-2 cameras on a Xavier NX running L4T 32.7.2. I can only get the example to work with three cameras; if I add the fourth, the output from all cameras/devices is the image from the first camera.

Are you able to provide files for synchronization of four cameras for 32.7.2 similar to what was done in the forum post below?

Thank you!

Hi,
This will be available in r32.7.3

Thanks

Alright, that is great to hear!
As it will probably be a while until r32.7.3 is released, and I would then have to wait for my carrier board manufacturer to make a BSP for that version before I can start using it, is there any way you could provide libraries for r32.7.2 that support synchronization of four sensors?
If not, do you have an ETA for r32.7.3? My system is almost ready; I just lack proper synchronization of the last sensor.

Oh, and one more thing. I need good synchronization because I am doing panorama stitching from the different camera streams, and I would rather have exposure etc. be unique to each camera. Is there any way to achieve this using a single capture session?

R32.7.3 will be available in September ’22.
Before this release, I would suggest using r32.4 with the libs from the topic you posted.

Unfortunately, our cameras are only supported by the L4T r32.7.2, r32.7.1 and r32.6.1 BSPs. If you are able to supply libraries for any of those releases, that would fix our problem.

Also, is there a way to have exposure settings etc. not be shared across cameras within a single capture session? Our cameras see very different lighting conditions while still requiring good synchronization, so this is another problem for us.
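To make the question more concrete, this is the kind of per-camera control I am after. Below is a minimal sketch of forcing manual exposure/gain on a request via ISourceSettings and locking AE/AWB via IAutoControlSettings; the open question is whether such a request can be tied to a single sensor inside one shared multi-sensor capture session (the helper name is just mine):

```cpp
#include <Argus/Argus.h>

using namespace Argus;

// Hypothetical helper: fixes exposure/gain and locks auto control for one request.
static bool applyManualExposure(Request *request, uint64_t exposureNs, float gain)
{
    IRequest *iRequest = interface_cast<IRequest>(request);
    if (!iRequest)
        return false;

    // Pin the sensor exposure time and analog gain for this request.
    ISourceSettings *iSource =
        interface_cast<ISourceSettings>(iRequest->getSourceSettings());
    if (!iSource)
        return false;
    iSource->setExposureTimeRange(Range<uint64_t>(exposureNs, exposureNs));
    iSource->setGainRange(Range<float>(gain, gain));

    // Lock AE/AWB so the ISP stops adapting based on what the first sensor sees.
    IAutoControlSettings *iAuto =
        interface_cast<IAutoControlSettings>(iRequest->getAutoControlSettings());
    if (!iAuto)
        return false;
    iAuto->setAeLock(true);
    iAuto->setAwbLock(true);

    return true;
}
```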

Thank you for your help!

Sorry, I don’t understand the question well.
Could you give more detail?

Hi,
Please replace with attached libs and try your use-case:
r32_7_TEST_argus_multi_sources.zip (2.6 MB)

Please backup the original libs before the replacement just in case.

In the syncSensor example the synchronized cameras are set up to share auto-control settings such as hue and exposure, based on what the first sensor that is added sees. I have tried a lot of things to make these settings individual to each sensor, but so far I haven’t been able to.

This makes sense for a stereo vision setup where the cameras are mounted facing the same direction, but it is not good for our use case, since some of our cameras face in different directions.

This is the code in the example:

```cpp
// Create the capture session, AutoControl will be based on what the 1st device sees.
UniqueObj<CaptureSession> captureSession(iCameraProvider->createCaptureSession(lrCameras));
ICaptureSession *iCaptureSession = interface_cast<ICaptureSession>(captureSession);
if (!iCaptureSession)
    ORIGINATE_ERROR("Failed to get capture session interface");
...
```
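The obvious fallback I can see is to create one capture session per device, which keeps auto control independent for each sensor but gives up exactly the single-session synchronization I need. A rough sketch of that fallback, reusing the sample’s iCameraProvider and lrCameras (the MAX_CAMERAS constant is just my own placeholder):

```cpp
// Fallback sketch: one CaptureSession per camera, so each sensor keeps its
// own auto control, at the cost of losing the shared-session synchronization.
static const size_t MAX_CAMERAS = 4;
UniqueObj<CaptureSession> sessions[MAX_CAMERAS];

for (size_t i = 0; i < lrCameras.size() && i < MAX_CAMERAS; ++i)
{
    sessions[i].reset(iCameraProvider->createCaptureSession(lrCameras[i]));
    if (!interface_cast<ICaptureSession>(sessions[i]))
        ORIGINATE_ERROR("Failed to create per-camera capture session");
    // ... create an OutputStream and a repeating Request for sessions[i] here ...
}
```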

Thank you!
I won’t be able to test this until Monday, but I will let you know if it works.

It does not fix the problem, I am afraid.

The same thing happens with these libraries: when I run the modified syncSensor sample with four cameras, all the video streams show the output from the first camera. With three cameras it runs stably at 20 FPS, with each camera displaying its own video stream. I have tested the four-camera setup at 5, 10 and 20 FPS, but that does not fix the problem.

Hi,
Your issue looks different. If it were hitting the limitation, the cameras would fail to launch and some error prints would be shown, but in your case the cameras do launch. Does it work if you launch the 4 cameras with a gstreamer command? We would like to confirm the issue is specific to running syncSensor.

Thank you for the quick response!
Yes, previously I have streamed from all four cameras into OpenCV through gstreamer using the VideoCapture class. I have also run four individual camera streams via gstreamer, and I am able to run the 13_multi_camera sample with all four cameras.
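For reference, this is roughly how I have been opening the four streams into OpenCV; a minimal sketch, where the nvarguscamerasrc caps (resolution, framerate) are just the values from my setup:

```cpp
#include <opencv2/opencv.hpp>
#include <string>
#include <vector>

int main()
{
    std::vector<cv::VideoCapture> caps;
    for (int id = 0; id < 4; ++id)
    {
        // One nvarguscamerasrc pipeline per sensor; caps are placeholders.
        std::string pipeline =
            "nvarguscamerasrc sensor-id=" + std::to_string(id) +
            " ! video/x-raw(memory:NVMM), width=1920, height=1080, framerate=20/1"
            " ! nvvidconv ! video/x-raw, format=BGRx"
            " ! videoconvert ! video/x-raw, format=BGR ! appsink";
        caps.emplace_back(pipeline, cv::CAP_GSTREAMER);
        if (!caps.back().isOpened())
            return -1;
    }

    // Grab a few frames from each camera to confirm all four streams are live.
    cv::Mat frame;
    for (int i = 0; i < 100; ++i)
        for (auto &cap : caps)
            cap.read(frame);

    return 0;
}
```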

Perhaps this helps in debugging the issue: I just tried running two instances of the syncSensor example with two cameras in each instance. This works fine, and I get the correct output from all cameras. However, this only gives me software synchronization of the cameras in pairs of two, not across all four.

Instance 1: /dev/video0 (master) & /dev/video1
Instance 2: /dev/video2 & /dev/video3

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one.
Thanks

Hi,
Please share a patch to syncSensor so that we can reproduce the issue and check further. As a quick solution, please run two instances of syncSensor.