Questions about using multi-session requests in libargus

I am using libargus (JP4.6.4) to sync and capture frames synchronously from two cameras. I create one request with the two devices. How can I get the Bayer average map of each camera, rather than only the 1st device's, to do custom AE and AWB? Do you have any recommended solution for AE and AWB in this situation?

And one more question: can I say the frames are captured at the same time, given that they have the same timestamp in the metadata?

Thank you.

hello 1356046979,

do you have separate threads to launch the cameras?
if yes, you should be able to get the metadata from the current EGLStream frame.

What do you mean by ‘separate threads to launch the camera’? In our application:

std::vector<CameraDevice*> cameraDevices;
Argus::Status status = iCameraProvider->getCameraDevices(&cameraDevices);
ICaptureSession* iSyncSession = interface_cast<ICaptureSession>(iCameraProvider->createCaptureSession(cameraDevices));
Argus::Request * imgReq = iSyncSession->createRequest(CAPTURE_INTENT_STILL_CAPTURE);

Then I use the request and the camera devices to create two cuEGLStreamConsumer instances in two threads.

ICaptureSession* iCaptureSession = interface_cast<ICaptureSession>(p_captureSession);
    UniqueObj<OutputStreamSettings> outputSettings(
        iCaptureSession->createOutputStreamSettings(STREAM_TYPE_EGL));
    IOutputStreamSettings* iStreamSettings =
        interface_cast<IOutputStreamSettings>(outputSettings);
    IEGLOutputStreamSettings* iIEGLOutputSettings =
        interface_cast<IEGLOutputStreamSettings>(outputSettings);

    if (iIEGLOutputSettings && iStreamSettings) {
        if (!iIEGLOutputSettings->supportsOutputStreamFormat(
                m_param.sensorMode, getPixelFormat(m_param.pixelFormat))) {
            Log::ERROR("Unsupported pixel format!");
            return false;
        }
        iStreamSettings->setCameraDevice(m_cameraDevice); // bind this stream to one camera
        iIEGLOutputSettings->setPixelFormat(getPixelFormat(m_param.pixelFormat));
        if (eglDisplay == EGL_NO_DISPLAY) {
            iIEGLOutputSettings->setResolution(
                Size2D<uint32_t>(m_param.resolution[0], m_param.resolution[1]));
        } else {
            iIEGLOutputSettings->setEGLDisplay(eglDisplay);
            // iIEGLOutputSettings->setResolution(Size2D<uint32_t>(
            //     m_param.preview_resolution[0], m_param.preview_resolution[1]));
        }
        // iIEGLOutputSettings->setFifoLength(10);
    }
    m_OutputStream = UniqueObj<OutputStream>(
        iCaptureSession->createOutputStream(outputSettings.get()));

And we get the metadata by:

    UniqueObj<EGLStream::MetadataContainer> metadataContainer(
        EGLStream::MetadataContainer::create(
            EGL_NO_DISPLAY, m_iEGLOutputStream->getEGLStream()));
    EGLStream::IArgusCaptureMetadata* iArgusCaptureMetadata =
        interface_cast<EGLStream::IArgusCaptureMetadata>(metadataContainer);
    CaptureMetadata* metaData = iArgusCaptureMetadata->getMetadata();
    ICaptureMetadata* iMetaData = interface_cast<ICaptureMetadata>(metaData);

In JP 4.6.2 it seemed that we could only get the metadata of the 1st camera. So I wonder whether we can get the metadata from each EGLStream frame in JP 4.6.4 when the request and EGLStreams are created this way? And what is the difference between getting the metadata from the request and from the EGLStream?
Thank you.

what is this output stream for? is it composing your dual camera devices?

Sorry for omitting the code. This stream is actually the stream we created above:

m_iEGLOutputStream = interface_cast<IEGLOutputStream>(m_OutputStream);

As you can see each stream is set for one camera.

We connect two IMX477 sensors to a Jetson Xavier NX and use the two streams above to read and process frames respectively, but they are captured by the same request.

hello 1356046979,

I assume you’ve got AE free-running, right? (the two cameras running with separate AE algorithms)
could you please run a quick test…
for example, please cover Cam-A; it should not affect the AE behavior of Cam-B.

No, I am sure that the two cameras are sharing the AE and AWB params (because they share the same request, I guess?).

And the phenomenon is that libargus only considers the 1st camera for AE and AWB (we did your experiment and confirmed this).
So what we hope to do is still capture the two cameras simultaneously, but use one AE that considers the full FOV covered by both cameras. We found that libargus does not support this feature currently (JP4.6), so we are trying to get two separate bayerAverageMaps and use both maps to calculate the AE and AWB params.

hello 1356046979,

since libargus only considers the 1st camera, that’s why you get a single metadata result.

it looks like you’re running with a single session instead of multi-session.
please refer to Argus sample code.
for example, Argus/public/apps/camera/ui/multiSession/AppModuleMultiSession.cpp

Thank you. Sorry for my wrong description.

Is it possible to capture two IMX477s at the same time while considering the full FOV covered by both cameras? Do you have any recommended solution?

hello 1356046979,

let’s say you would like to average the CAM-A & CAM-B AE settings and apply the result to both cameras.
am I understanding your idea correctly?
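Averaging the two cameras' AE results into one shared setting could be sketched like this. `AeResult` and the plain arithmetic mean are assumptions about how the per-camera settings are represented; the commented getters are the ICaptureMetadata accessors the values would typically come from:

```cpp
#include <cstdint>

// Hypothetical container for the AE result of one camera.
struct AeResult {
    uint64_t exposureTimeNs;  // e.g. from ICaptureMetadata::getSensorExposureTime()
    float    analogGain;      // e.g. from ICaptureMetadata::getSensorAnalogGain()
};

// Blend CAM-A and CAM-B into one setting to apply to both cameras' requests.
AeResult averageAe(const AeResult& a, const AeResult& b) {
    return AeResult{
        (a.exposureTimeNs + b.exposureTimeNs) / 2,
        0.5f * (a.analogGain + b.analogGain),
    };
}
```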

Yeah you are right.

So is there any way to get the bayerAverageMap of each of the two cameras?

hello 1356046979,

please refer to the Argus example, /usr/src/jetson_multimedia_api/argus/samples/bayerAverageMap
you may run the argus_bayeraveragemap sample app by specifying the --device=INDEX option for each camera device.

Yeah, we know how to get a single bayerAverageMap when we create one request with one camera device. But we wonder how to get two bayerAverageMaps when we create one request with two camera devices.

As you said, we need to average the CAM-A & CAM-B AE settings, but we also need to capture the two cameras at the same time (one request with two cameras).

Thank you.

hello 1356046979,

you have to use multi-session to obtain the two AE settings separately.
and… here’s another example, userAutoExposure, which shows how to program the exposure time yourself.
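The idea behind userAutoExposure, adjusting the exposure until the measured brightness meets a target, can be sketched as a simple feedback step. The function name, the target luma, and the clamp bounds below are assumptions for illustration:

```cpp
#include <algorithm>
#include <cstdint>

// One AE feedback step: scale the current exposure by target/measured luma,
// clamped to the sensor mode's exposure-time range.
uint64_t nextExposureNs(uint64_t currentNs, float measuredLuma, float targetLuma,
                        uint64_t minNs, uint64_t maxNs) {
    const float ratio = targetLuma / std::max(measuredLuma, 1e-6f);
    const uint64_t next = static_cast<uint64_t>(static_cast<float>(currentNs) * ratio);
    return std::clamp(next, minNs, maxNs);
}
```

The resulting value would then be applied to each session's request, typically through ISourceSettings::setExposureTimeRange().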

OK, I got it. There is no way to get the metadata of two cameras with one session. Thank you.