libArgus Multi Camera Capture Session

I’m building an application that initializes two camera streams, takes an image from each, converts them into cv::Mat objects, and does miscellaneous OpenCV/CUDA work on them (undistortion, etc.).

So far I’ve been able to use tegra_multimedia_api/samples/11_camera_object_identification to get one stream working, but trying to activate two streams using the example from tegra_multimedia_api/argus/samples/syncSensor fails with the NotImplemented errors below:

Failed to query video capabilities: Bad address
libv4l2_nvvidconv (0):(765) (INFO) : Allocating (10) OUTPUT PLANE BUFFERS Layout=1
libv4l2_nvvidconv (0):(775) (INFO) : Allocating (10) CAPTURE PLANE BUFFERS Layout=0
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingFramerate set to : 30 at NvxVideoEncoderSetParameterSCF: Error NotImplemented: unsupported output buffer config (in src/components/CaptureSetupEngineImpl.cpp, function chooseGenInstFunc(), line 154)
SCF: Error NotImplemented:  (propagating from src/components/CaptureSetupEngineImpl.cpp, function doGetInstructions(), line 1700)
SCF: Error NotImplemented:  (propagating from src/components/CaptureSetupEngine.cpp, function setupCC(), line 167)
SCF: Error NotImplemented:  (propagating from src/api/Session.cpp, function capture(), line 726)
(Argus) Error NotImplemented: Failed to submit first capture request (propagating from src/api/CaptureSessionImpl.cpp, function submitCaptureRequests(), line 298)
(Argus) Error NotImplemented:  (propagating from src/api/CaptureSessionImpl.cpp, function threadFunction(), line 735)

The difference between working and not working is whether I initialize the capture session with the whole device vector or with a single device:

Works (can stream one camera):

UniqueObj<CaptureSession> captureSession(iCameraProvider->createCaptureSession(cameraDevices[0]));

Does not work (errors above):

UniqueObj<CaptureSession> captureSession(iCameraProvider->createCaptureSession(cameraDevices));

I am explicitly setting the CameraDevice through iStreamSettings before creating the stream:

iStreamSettings->setCameraDevice(cameraDevices[0]);
UniqueObj<OutputStream> outputStream(iCaptureSession->createOutputStream(streamSettings.get()));

Is there a step I’m missing to add multiple cameras to a single capture session?

Hi Atrer,
Which camera board are you using with two (or more) cameras?
Are you able to open the second camera with:

UniqueObj<CaptureSession> captureSession(iCameraProvider->createCaptureSession(cameraDevices[1]))

Hi Dane!

We’re using custom hardware running two Leopard Imaging LI-M021C-MIPI global shutter cameras (GitHub - Daxbot/daxc02: Nvidia Jetson TX1/TX2 Kernel Driver for Leopard Imaging LI-M021C-MIPI).

I am able to open and use the second camera with cameraDevices[1] without incident.

My previous version of this program used the GStreamer library to create two independent streams, but now I’m trying to move to libArgus.

Would it matter that I’m still on L4T r24.2? The sample is present in that release, so I’d assume not.

Hi Atrer,
There is a multi-sensor sample:

tegra_multimedia_api/argus/samples/multiSensor

Can you try it?

I was able to get two cameras working by creating a separate CaptureSession per camera and adding a sizable delay between the two initializations. It seems that if the second capture session tries to initialize while the first one is still starting up, it crashes.

Once the first stream starts and data is flowing, the second stream can be created and used.
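For anyone hitting the same issue, the workaround can be sketched generically in plain C++. The Argus-specific setup (createCaptureSession(cameraDevices[i]), stream creation, submitting the repeating request) is hidden behind a placeholder callable here, since the exact calls depend on the L4T release; CameraInit and initCamerasSequentially are names I made up for the sketch, not Argus API.

```cpp
#include <chrono>
#include <functional>
#include <thread>
#include <vector>

// Placeholder for one camera's bring-up: create its CaptureSession, create
// its OutputStream, and start capturing. Returns true once the stream is up.
using CameraInit = std::function<bool()>;

// Initialize cameras strictly one at a time, sleeping between sessions so the
// next CaptureSession is not created while the previous one is still settling.
bool initCamerasSequentially(const std::vector<CameraInit>& cameras,
                             std::chrono::milliseconds settleDelay)
{
    for (const CameraInit& init : cameras) {
        if (!init())
            return false;                          // abort on first failure
        std::this_thread::sleep_for(settleDelay);  // let the stream start flowing
    }
    return true;
}
```

In our case each CameraInit wraps the per-device session setup, and the settle delay is generous (on the order of seconds) to be safe.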