I’m building an application that initializes two camera streams, captures an image from each, converts them into cv::Mat objects, and does miscellaneous OpenCV/CUDA work on them (undistortion, etc.).
So far I’ve been able to use tegra_multimedia_api/samples/11_camera_object_identification to get one stream working, but when I try to activate two streams following the example in tegra_multimedia_api/argus/samples/syncSensor, it fails with the NotImplemented errors below:
Failed to query video capabilities: Bad address
libv4l2_nvvidconv (0):(765) (INFO) : Allocating (10) OUTPUT PLANE BUFFERS Layout=1
libv4l2_nvvidconv (0):(775) (INFO) : Allocating (10) CAPTURE PLANE BUFFERS Layout=0
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingFramerate set to : 30 at NvxVideoEncoderSetParameterSCF: Error NotImplemented: unsupported output buffer config (in src/components/CaptureSetupEngineImpl.cpp, function chooseGenInstFunc(), line 154)
SCF: Error NotImplemented: (propagating from src/components/CaptureSetupEngineImpl.cpp, function doGetInstructions(), line 1700)
SCF: Error NotImplemented: (propagating from src/components/CaptureSetupEngine.cpp, function setupCC(), line 167)
SCF: Error NotImplemented: (propagating from src/api/Session.cpp, function capture(), line 726)
(Argus) Error NotImplemented: Failed to submit first capture request (propagating from src/api/CaptureSessionImpl.cpp, function submitCaptureRequests(), line 298)
(Argus) Error NotImplemented: (propagating from src/api/CaptureSessionImpl.cpp, function threadFunction(), line 735)
The only difference between the working and failing cases is whether I initialize the capture session with a single device or with the full device vector:
Works (can stream one camera):
UniqueObj<CaptureSession> captureSession(iCameraProvider->createCaptureSession(cameraDevices[0]));
Does not work (produces the errors above):
UniqueObj<CaptureSession> captureSession(iCameraProvider->createCaptureSession(cameraDevices));
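For context, here is roughly how I am enumerating the devices and creating the multi-device session, following the syncSensor sample (a sketch with error checks abbreviated; the exact interface names come from my L4T Argus headers and may differ between releases):

```cpp
#include <Argus/Argus.h>
#include <vector>

using namespace Argus;

// Create the camera provider and enumerate all attached sensors.
UniqueObj<CameraProvider> cameraProvider(CameraProvider::create());
ICameraProvider *iCameraProvider =
    interface_cast<ICameraProvider>(cameraProvider);

std::vector<CameraDevice*> cameraDevices;
iCameraProvider->getCameraDevices(&cameraDevices);
// A multi-device session needs at least two sensors in this vector.

// Passing the whole vector is what triggers the NotImplemented errors;
// passing cameraDevices[0] alone works.
UniqueObj<CaptureSession> captureSession(
    iCameraProvider->createCaptureSession(cameraDevices));
ICaptureSession *iCaptureSession =
    interface_cast<ICaptureSession>(captureSession);
```

This mirrors the syncSensor sample as far as I can tell; it only runs on the Jetson itself, so I can't isolate it further off-target.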
I am also explicitly setting the CameraDevice through iStreamSettings before creating each stream:
iStreamSettings->setCameraDevice(cameraDevices[0]);
UniqueObj<OutputStream> outputStream(iCaptureSession->createOutputStream(streamSettings.get()));
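The full stream/request wiring I'm attempting looks roughly like this, again modeled on syncSensor (a sketch: `streamLeft`/`streamRight` are my names, and I'm assuming one set of stream settings can be reused with setCameraDevice switched between creations, which is what the sample appears to do):

```cpp
// One set of stream settings, reused for both sensors.
UniqueObj<OutputStreamSettings> streamSettings(
    iCaptureSession->createOutputStreamSettings());
IOutputStreamSettings *iStreamSettings =
    interface_cast<IOutputStreamSettings>(streamSettings);

// Bind stream 0 to the first sensor, stream 1 to the second.
iStreamSettings->setCameraDevice(cameraDevices[0]);
UniqueObj<OutputStream> streamLeft(
    iCaptureSession->createOutputStream(streamSettings.get()));

iStreamSettings->setCameraDevice(cameraDevices[1]);
UniqueObj<OutputStream> streamRight(
    iCaptureSession->createOutputStream(streamSettings.get()));

// A single request that enables both output streams.
UniqueObj<Request> request(iCaptureSession->createRequest());
IRequest *iRequest = interface_cast<IRequest>(request);
iRequest->enableOutputStream(streamLeft.get());
iRequest->enableOutputStream(streamRight.get());

// Submitting this is where the first-capture-request error appears.
iCaptureSession->repeat(request.get());
```

The failure happens as soon as the first request is submitted, before any frames are produced.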
Is there a step I'm missing in order to add multiple cameras to a single capture session?