Multiple outputStreams with a multi camera captureSession

Hello,

I have one captureSession, which should manage up to three cameras and supply two frame consumers.
One consumer renders a preview to the screen and the other saves the current frame(s) as JPEG.

As long as only one camera is connected, everything works fine. Both outputStreams (one stream for each consumer) receive frames and the consumers can process them correctly.
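
For reference, the stream setup in the single-camera case looks roughly like the sketch below. It follows the usual Argus/EGLStream pattern from the MMAPI samples; error handling and the consumer threads are omitted, and the resolution/pixel format are just placeholder values, so treat it as an illustration of the configuration rather than my exact code.

```cpp
#include <Argus/Argus.h>
#include <EGLStream/EGLStream.h>
#include <vector>

using namespace Argus;

// One camera, one capture session, two EGL output streams:
// one for the preview consumer, one for the JPEG consumer.
int main()
{
    UniqueObj<CameraProvider> provider(CameraProvider::create());
    ICameraProvider *iProvider = interface_cast<ICameraProvider>(provider);

    std::vector<CameraDevice*> devices;
    iProvider->getCameraDevices(&devices);

    UniqueObj<CaptureSession> session(iProvider->createCaptureSession(devices[0]));
    ICaptureSession *iSession = interface_cast<ICaptureSession>(session);

    UniqueObj<OutputStreamSettings> settings(
        iSession->createOutputStreamSettings(STREAM_TYPE_EGL));
    IEGLOutputStreamSettings *iEglSettings =
        interface_cast<IEGLOutputStreamSettings>(settings);
    iEglSettings->setPixelFormat(PIXEL_FMT_YCbCr_420_888);
    iEglSettings->setResolution(Size2D<uint32_t>(1920, 1080));

    // One stream per consumer, both created from the same settings.
    UniqueObj<OutputStream> previewStream(iSession->createOutputStream(settings.get()));
    UniqueObj<OutputStream> jpegStream(iSession->createOutputStream(settings.get()));

    // Each consumer acquires frames from its own stream (consumer threads not shown).
    UniqueObj<EGLStream::FrameConsumer> previewConsumer(
        EGLStream::FrameConsumer::create(previewStream.get()));
    UniqueObj<EGLStream::FrameConsumer> jpegConsumer(
        EGLStream::FrameConsumer::create(jpegStream.get()));

    // Both streams are enabled on the same repeating request.
    UniqueObj<Request> request(iSession->createRequest());
    IRequest *iRequest = interface_cast<IRequest>(request);
    iRequest->enableOutputStream(previewStream.get());
    iRequest->enableOutputStream(jpegStream.get());

    iSession->repeat(request.get());
    // ... consumer threads call IFrameConsumer::acquireFrame() on their streams ...
    return 0;
}
```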

Also, if I have one consumer with multiple cameras connected (one consumer manages multiple streams), everything works fine. The consumer can read the frames from all streams and process them.

However, if I activate two outputStreams on two cameras (that is 4 streams in total, with each consumer getting one stream from each camera), I don’t get any frames at all on any outputStream.
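
The multi-camera variant only changes the session and stream creation; a rough sketch of the failing configuration is below (continuing from the sketch above, with iProvider and devices set up the same way, the per-stream FrameConsumer setup omitted, and again only meant to illustrate the configuration, not my exact code).

```cpp
// Two cameras in one capture session, two streams per camera (4 streams total).
UniqueObj<CaptureSession> session(iProvider->createCaptureSession(devices));
ICaptureSession *iSession = interface_cast<ICaptureSession>(session);

UniqueObj<Request> request(iSession->createRequest());
IRequest *iRequest = interface_cast<IRequest>(request);

UniqueObj<OutputStream> streams[4];
size_t idx = 0;

for (size_t cam = 0; cam < 2; ++cam)
{
    UniqueObj<OutputStreamSettings> settings(
        iSession->createOutputStreamSettings(STREAM_TYPE_EGL));
    interface_cast<IEGLOutputStreamSettings>(settings)
        ->setPixelFormat(PIXEL_FMT_YCbCr_420_888);
    // Bind the streams created from these settings to camera 'cam'.
    interface_cast<IOutputStreamSettings>(settings)->setCameraDevice(devices[cam]);

    for (int s = 0; s < 2; ++s)   // one preview stream and one JPEG stream per camera
    {
        streams[idx].reset(iSession->createOutputStream(settings.get()));
        iRequest->enableOutputStream(streams[idx].get());
        ++idx;
    }
}

iSession->repeat(request.get());  // with all four streams enabled, no frames arrive
```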

Is this an inherent limitation of libargus or is there something else I have to consider to get frames in this scenario?

There shouldn’t be any limitation of this kind.
You can simulate it by running argus_camera --device=0 and argus_camera --device=1, with both of them outputting a JPEG stream.

I also don’t have any problems if I use two cameras, with one output stream for each camera, at the same time.

However, if I want to use two cameras with two output streams for each camera, I don’t get any frames delivered.

When you run two argus_camera instances with the JPEG encoder, that’s 4 output streams: two preview streams and two JPEG streams.

When I start argus_camera with camera 0, it seems to work. If I switch to camera 1, I get the following error message:

Executing Argus Sample Application (argus_camera)
Argus Version: 0.98.3 (multi-process)
Error generated. /usr/src/jetson_multimedia_api/argus/samples/utils/Validator.h, checkValid:670 Value '21.8' out of range. Valid values need to be in the range [3.40282e+38, 21.8].
Error generated. /usr/src/jetson_multimedia_api/argus/samples/utils/Value.h, set:94 (propagating)
Error generated. /usr/src/jetson_multimedia_api/argus/apps/camera/modules/Dispatcher.cpp, onSensorModeIndexChanged:1176 (propagating)
Error generated. /usr/src/jetson_multimedia_api/argus/samples/utils/Observed.cpp, notifyObservers:92 (propagating)
Error generated. /usr/src/jetson_multimedia_api/argus/samples/utils/Value.h, set:98 (propagating)
Error generated. /usr/src/jetson_multimedia_api/argus/apps/camera/modules/Dispatcher.cpp, onDeviceIndexChanged:1122 (propagating)
Error generated. /usr/src/jetson_multimedia_api/argus/samples/utils/Observed.cpp, notifyObservers:92 (propagating)
Error generated. /usr/src/jetson_multimedia_api/argus/samples/utils/Value.h, set:98 (propagating)
Error generated. /usr/src/jetson_multimedia_api/argus/samples/utils/Value.h, setFromString:112 (propagating)
Error generated. /usr/src/jetson_multimedia_api/argus/samples/utils/Options.h, valueCallback:320 (propagating)
Error generated. /usr/src/jetson_multimedia_api/argus/samples/utils/Options.cpp, parse:246 (propagating)
Error generated. /usr/src/jetson_multimedia_api/argus/apps/camera/ui/common/App.cpp, run:88 (propagating)

However, the multi-session mode seems to work normally.

Is there any progress on this?

It could be that your second sensor reports incorrect gain/exposure/frame rate ranges.

I have two identical cameras connected. Both use the same driver and the same settings. I don’t understand why camera 0 works and camera 1 doesn’t.

What’s your version?
Did you get the matching version of the MMAPI sample code?

I am using JetPack 4.6.2 and L4T 32.7.2.
I used the MMAPI samples which were installed with the API via the SDK Manager.
The Argus camera API is version 0.98 (from the RELEASE.TXT file).

Does any of your gain/frame rate ranges fall in the range [3.40282e+38, 21.8]?

I don’t know where the 3.40282e+38 is taken from; the supported range should be [0, 21.8].
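
(3.40282e+38 is FLT_MAX, so it looks like the lower bound of that range comes back uninitialized rather than from the sensor mode.) To cross-check what the driver actually reports through libargus, a small probe like the one below can print the gain, exposure and frame-duration ranges of every sensor mode on both cameras. It is only a sketch using the standard ICameraProperties/ISensorMode queries, not taken from the samples.

```cpp
#include <Argus/Argus.h>
#include <cstdint>
#include <cstdio>
#include <vector>

using namespace Argus;

// Dump the per-sensor-mode ranges that each camera reports to libargus.
int main()
{
    UniqueObj<CameraProvider> provider(CameraProvider::create());
    ICameraProvider *iProvider = interface_cast<ICameraProvider>(provider);

    std::vector<CameraDevice*> devices;
    iProvider->getCameraDevices(&devices);

    for (size_t d = 0; d < devices.size(); ++d)
    {
        ICameraProperties *iProps = interface_cast<ICameraProperties>(devices[d]);
        std::vector<SensorMode*> modes;
        iProps->getAllSensorModes(&modes);

        for (size_t m = 0; m < modes.size(); ++m)
        {
            ISensorMode *iMode = interface_cast<ISensorMode>(modes[m]);
            Range<float> gain = iMode->getAnalogGainRange();
            Range<uint64_t> expo = iMode->getExposureTimeRange();
            Range<uint64_t> dur = iMode->getFrameDurationRange();
            printf("device %zu mode %zu: gain [%f, %f], exposure [%llu, %llu] ns, "
                   "frame duration [%llu, %llu] ns\n",
                   d, m, gain.min(), gain.max(),
                   (unsigned long long)expo.min(), (unsigned long long)expo.max(),
                   (unsigned long long)dur.min(), (unsigned long long)dur.max());
        }
    }
    return 0;
}
```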

However, if I use my application I can access both cameras (also simultaneously), as long as I only use one stream for each camera.
But I still can’t connect two streams each to two (or three) cameras.