More than two camera devices in one capture session won't work

I have four IMX568 sensors (2x mono and 2x color) on a Jetson Orin NX on a “custom” carrier board (Forecr DSBOARD ORNX).
My setup passes a vector of camera devices into one capture session, which creates buffer output streams from which we can read EGLImages.
With two cameras this works perfectly fine: every combination of the four cameras, as long as only two are used, results in two output streams that make sense.
If I try to capture from three devices in one capture session, I can no longer acquire the buffers, and if I try to capture from four devices, I get back the image of the first camera device four times. How can this be?
Is it possible to have more than two devices in one capture session, and do you have any idea what goes wrong here?
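Simplified (error handling omitted, and untested as posted here), my setup follows the multi-device pattern suggested by the public Argus headers — the names below come from those headers, not from my exact code:

```cpp
#include <Argus/Argus.h>
#include <vector>

using namespace Argus;

int main()
{
    UniqueObj<CameraProvider> provider(CameraProvider::create());
    ICameraProvider *iProvider = interface_cast<ICameraProvider>(provider);

    std::vector<CameraDevice*> devices;
    iProvider->getCameraDevices(&devices); // the four IMX568s

    // One capture session driving all devices (multi-device overload).
    UniqueObj<CaptureSession> session(iProvider->createCaptureSession(devices));
    ICaptureSession *iSession = interface_cast<ICaptureSession>(session);

    UniqueObj<OutputStreamSettings> settings(
        iSession->createOutputStreamSettings(STREAM_TYPE_EGL));
    IOutputStreamSettings *iSettings =
        interface_cast<IOutputStreamSettings>(settings);

    std::vector<UniqueObj<OutputStream>> streams;
    for (CameraDevice *dev : devices)
    {
        // Bind each output stream to one sensor before creating it.
        iSettings->setCameraDevice(dev);
        streams.push_back(UniqueObj<OutputStream>(
            iSession->createOutputStream(settings.get())));
    }

    UniqueObj<Request> request(iSession->createRequest());
    IRequest *iRequest = interface_cast<IRequest>(request);
    for (auto &s : streams)
        iRequest->enableOutputStream(s.get());

    iSession->repeat(request.get()); // one request captures all sensors
    return 0;
}
```

(I can share the real code if it helps; the sketch only compiles against the jetson_multimedia_api / Argus SDK on the device.)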

Please check the argus_camera sample in MMAPI to confirm.

Thanks

The sample works fine.

It also works when I create a separate capture session for each camera (as the sample does). But I am curious why it doesn’t work to put more than two (up to four) cameras into one capture session.

Regarding single-session support for three cameras: are the sensors’ output size and frame rate the same?

Sorry for the late reply: yes, they are the same.

Please confirm with the syncSensor or syncStereo sample for a single session.

Thanks

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.