[Closed] How to set WDR DOL sensor mode

Hi All,

I want to use WDR mode in my app.
argus_camera does this well, and the result looks like an HDR image.
I have looked through the argus_camera sources, but there is nothing special there, only switching modeIndex in onSensorModeChanged.

The sensor has 4 modes, including the Ext WDR modes.

Here is my code:

// Setting mode

    Argus::ISourceSettings * iSourceSettings{ Argus::interface_cast< Argus::ISourceSettings >( iRequest->getSourceSettings() ) };
    if( !iSourceSettings ) ORIGINATE_ERROR( "Failed to get ISourceSettings interface!" );
    iSourceSettings->setSensorMode( sensorModes[ sensorModeIndex ] );

// Checking sensor mode
    Argus::SensorMode * sm{ iSourceSettings->getSensorMode() };
    Argus::ISensorMode * iSM{ Argus::interface_cast< Argus::ISensorMode >( sm ) };
    if( !iSM ) ORIGINATE_ERROR( "Failed to get sensor mode interface!" );

    Argus::Ext::IPwlWdrSensorMode * pwlMode{ Argus::interface_cast< Argus::Ext::IPwlWdrSensorMode >( sm ) };
    Argus::Ext::IDolWdrSensorMode * dolMode{ Argus::interface_cast< Argus::Ext::IDolWdrSensorMode >( sm ) };
    if( pwlMode )
    {
        std::cout << "WDR Pwl @" << iSM->getInputBitDepth() << "bpp -> " <<
                     iSM->getOutputBitDepth() << "bpp\n";
    } else if( dolMode )
    {
        std::cout << "WDR Dol @" << iSM->getOutputBitDepth() << "bpp -> " <<
                     dolMode->getExposureCount() << " exposure DOL WDR\n";
    } else
    {
        std::cout << "Simple @" << iSM->getOutputBitDepth() << "bpp\n";
    }
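
The snippet assumes sensorModes was filled earlier. For completeness, a minimal sketch (not compiled here, assuming a valid Argus::CameraDevice * cameraDevice) of how that vector is typically obtained via ICameraProperties:

```cpp
// Sketch: fill sensorModes from the device's ICameraProperties.
std::vector< Argus::SensorMode * > sensorModes;
Argus::ICameraProperties * iProperties{
    Argus::interface_cast< Argus::ICameraProperties >( cameraDevice ) };
if( !iProperties ) ORIGINATE_ERROR( "Failed to get ICameraProperties interface!" );
iProperties->getAllSensorModes( & sensorModes );
```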

The result is: WDR Dol @10bpp -> 2 exposure DOL WDR
As in argus_camera.

But the frame is brown now.

How do I set WDR mode, and how do I use it?
Maybe the EGLFrame has more planes/buffers?

Is the merging of the exposures an automatic function, or should I do it myself?

Best regards, Viktor.

hello vsw,

DOL-WDR is a kind of sensor capability to capture a frame with different exposure settings (usually a short exposure and a long exposure).
this feature can produce better capture results for scenes that call for high dynamic range.
according to the Jetson AGX Xavier Software Features, we have validated DOL-WDR with the Sony IMX274.
please contact a Jetson Preferred Partner if you are asking about further camera solution support,
thanks

Thanks Jerry Chang. I already have an IMX274. The question is where in argus_camera the 2 exposure frames become a single frame. An example of code is needed.

Here is the difference.

From argus_camera, WDR DOL mode:

From my app, WDR DOL mode:

What else do I need to do to get a true HDR image, as in the argus_camera app?

Here is the code for saving an image in my app:

void ConsumerThread::saveJpeg( EGLStream::IFrame * iFrame, const int32_t & currentFrame )
{
    EGLStream::IImageJPEG * iJPEG{ Argus::interface_cast< EGLStream::IImageJPEG >( iFrame->getImage() ) };    
    if( iJPEG )
    {
        const std::chrono::system_clock::time_point now{ std::chrono::system_clock::now() };
        const std::string path{ recordPath + "/" + time_point_to_string( now ) + "_" + std::to_string( currentFrame ) + ".jpg" };
        Argus::Status status{ iJPEG->writeJPEG( path.c_str() ) };
        if( status != Argus::STATUS_OK )
            CONSUMER_PRINT( "\tFailed to write JPEG: %s (status %d)!\n", path.c_str(), status );
    }
}

The same problem occurs with video encoding.

Here is the log:

CUDA Device Query (Driver API) statically linked version 
Detected 1 CUDA Capable device(s)

Device 0: "NVIDIA Tegra X2"
  CUDA Driver Version:                           10.0
  CUDA Capability Major/Minor version number:    6.2
  Total amount of global memory:                 7861 MBytes (8242810880 bytes)
  ( 2) Multiprocessors, (128) CUDA Cores/MP:     256 CUDA Cores
  GPU Max Clock rate:                            1020 MHz (1.02 GHz)
  Memory Clock rate:                             1300 Mhz
  Memory Bus Width:                              128-bit
  L2 Cache Size:                                 524288 bytes
  Max Texture Dimension Sizes                    1D=(131072) 2D=(131072, 65536) 3D=(16384, 16384, 16384)
  Maximum Layered 1D Texture Size, (num) layers  1D=(32768), 2048 layers
  Maximum Layered 2D Texture Size, (num) layers  2D=(32768, 32768), 2048 layers
  Total amount of constant memory:               65536 bytes
  Total amount of shared memory per block:       49152 bytes
  Total number of registers available per block: 32768
  Warp size:                                     32
  Maximum number of threads per multiprocessor:  2048
  Maximum number of threads per block:           1024
  Max dimension size of a thread block (x,y,z): (1024, 1024, 64)
  Max dimension size of a grid size (x,y,z):    (2147483647, 65535, 65535)
  Texture alignment:                             512 bytes
  Maximum memory pitch:                          2147483647 bytes
  Concurrent copy and kernel execution:          Yes with 1 copy engine(s)
  Run time limit on kernels:                     No
  Integrated GPU sharing Host Memory:            Yes
  Support host page-locked memory mapping:       Yes
  Concurrent kernel execution:                   Yes
  Alignment requirement for Surfaces:            Yes
  Device has ECC support:                        Disabled
  Device supports Unified Addressing (UVA):      Yes
  Supports Cooperative Kernel Launch:            Yes
  Supports MultiDevice Co-op Kernel Launch:      Yes
  Device PCI Domain ID / Bus ID / location ID:   0 / 0 / 0
NPP Library Version 10.0.166
Num CPU's 6, 6
PRODUCER: Prepare CUDA maps...
PRODUCER: Argus Version: 0.97.3 (multi-process)
(null)UUID:                      adca2c00,0f01,11e5,0002,00,00,00,00,00,00
(null)MaxAeRegions:              64
(null)MaxAwbRegions:             64
(null)FocusPositionRange:        [0, 0]
(null)LensApertureRange:         [2.200000, 2.200000]
(null)IspDigitalGainRange:       [1.000000, 256.000000]
(null)ExposureCompensationRange: [-2.000000, 2.000000]
(null)NumSensorModes:            4
(null)SensorMode 0:
(null)    Resolution:         3840x2160
(null)    ExposureTimeRange:  [44000, 478696000]
(null)    FrameDurationRange: [16666667, 666667072]
(null)                        (1.50 to 60.00 fps)
(null)    AnalogGainRange:    [1.000000, 44.400002]
(null)    InputBitDepth:      10
(null)    OutputBitDepth:     10
(null)    SensorModeType:     SENSOR_MODE_TYPE_BAYER
(null)    IS WDR Mode: No
(null)SensorMode 1:
(null)    Resolution:         1920x1080
(null)    ExposureTimeRange:  [58000, 184611000]
(null)    FrameDurationRange: [16666667, 666667072]
(null)                        (1.50 to 60.00 fps)
(null)    AnalogGainRange:    [1.000000, 177.000000]
(null)    InputBitDepth:      10
(null)    OutputBitDepth:     10
(null)    SensorModeType:     SENSOR_MODE_TYPE_BAYER
(null)    IS WDR Mode: No
(null)SensorMode 2:
(null)    Resolution:         3840x2160
(null)    ExposureTimeRange:  [864000, 20480000]
(null)    FrameDurationRange: [33333334, 666667072]
(null)                        (1.50 to 30.00 fps)
(null)    AnalogGainRange:    [1.000000, 30.000000]
(null)    InputBitDepth:      10
(null)    OutputBitDepth:     10
(null)    SensorModeType:     SENSOR_MODE_TYPE_BAYER
(null)    DOL WDR Mode Properties:
(null)      ExposureCount:        2
(null)      OpticalBlackRowCount: 14
(null)      VBPRowCounts:         [50]
(null)      LineInfoMarkerWidth:  4
(null)      LeftMarginWidth:      12
(null)      RightMarginWidth:     0
(null)      PhysicalResolution:   3856x4448
(null)SensorMode 3:
(null)    Resolution:         1920x1080
(null)    ExposureTimeRange:  [859000, 15649000]
(null)    FrameDurationRange: [16666667, 666667072]
(null)                        (1.50 to 60.00 fps)
(null)    AnalogGainRange:    [1.000000, 177.000000]
(null)    InputBitDepth:      10
(null)    OutputBitDepth:     10
(null)    SensorModeType:     SENSOR_MODE_TYPE_BAYER
(null)    DOL WDR Mode Properties:
(null)      ExposureCount:        2
(null)      OpticalBlackRowCount: 14
(null)      VBPRowCounts:         [38]
(null)      LineInfoMarkerWidth:  4
(null)      LeftMarginWidth:      6
(null)      RightMarginWidth:     6
(null)      PhysicalResolution:   1936x2264
Using WDR SensorMode (index = 2)
PRODUCER: Creating output stream
PRODUCER: Launching consumer thread
Failed to query video capabilities: Inappropriate ioctl for device
Opening in BLOCKING MODE 
NvMMLiteOpen : Block : BlockType = 4 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
875967048
842091865
capture_plane.getNumBuffers: 6
H264: Profile = 100, Level = 51 
CONSUMER: Waiting until producer is connected...
Fps min: 2, max: 30
getExposureTimeRange 864000, 15649000
getFocusPosition 0
getGainRange 1, 30
getOpticalBlackEnable 0
getAeLock 0
getColorCorrectionMatrixEnable 0
getColorSaturation 1
getColorSaturationBias 1
getColorSaturationEnable 0
getExposureCompensation 0
getIspDigitalGainRange 1, 1
getToneMapCurveEnable 0
PRODUCER: Starting repeat capture requests.
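
One observation about the DOL mode properties in the log above: the PhysicalResolution values are consistent with both exposures (plus the optical-black and VBP rows) being stacked vertically in a single raw frame, with the line-info marker and the left/right margins padding the width. This is an inference from the printed numbers, not an official formula; a self-contained check in plain C++:

```cpp
#include <cstdint>

// DOL WDR raw-frame geometry, inferred from the mode properties in the
// log: exposures are stacked vertically together with the optical-black
// and VBP rows; the width is padded by the line-info marker and margins.
struct DolModeProps
{
    uint32_t width, height;        // the mode's output resolution
    uint32_t exposureCount;
    uint32_t opticalBlackRows;
    uint32_t vbpRows;
    uint32_t lineInfoMarkerWidth;
    uint32_t leftMargin, rightMargin;
};

uint32_t physicalWidth( const DolModeProps & m )
{
    return m.width + m.lineInfoMarkerWidth + m.leftMargin + m.rightMargin;
}

uint32_t physicalHeight( const DolModeProps & m )
{
    return m.exposureCount * ( m.height + m.opticalBlackRows + m.vbpRows );
}
```

With mode 2's numbers (3840x2160, 2 exposures, 14 OB rows, 50 VBP rows, marker 4, margins 12/0) this gives 3856x4448, and mode 3's numbers give 1936x2264, both matching the log. That suggests the two exposures arrive interleaved in one raw frame and are fused downstream.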

Also, a strange warning:

PRODUCER: Launching consumer thread
Failed to query video capabilities: Inappropriate ioctl for device
Opening in BLOCKING MODE 
NvMMLiteOpen : Block : BlockType = 4 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4

It happens after calling:

m_VideoEncoder = NvVideoEncoder::createVideoEncoder( "NVENC0" );
if( !m_VideoEncoder ) ORIGINATE_ERROR( "Could not create m_VideoEncoder!" );

in ConsumerThread::createVideoEncoder in my app.
But in argus_camera there are no errors, and the "Framerate set to" message is present:

Framerate set to : 30 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4 
Started recording video at 3840x2160, saving to './video0000'
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4

hello vsw,

suggest you refer to the L4T Multimedia API Reference and check the [Multimedia API Sample Applications] for examples.
you might refer to the [09_camera_jpeg_capture] sample, and you will also find that it launches the default sensor mode for capturing the image buffer.
since there is a mode selection in the Argus sample, could you please modify the source and launch the DOL sensor mode for verification.
thanks

Thanks. Will try.
I made some tests.
All the Argus samples (including the argus_camera app) give good WDR images in WDR DOL mode.
But in my program the image is corrupted, with a brown hue and lines.

The differences are:

  1. I am using 2 sensors at a time.
  2. I am using a consumer thread with EGLFrame (your multiStream app uses Frame in JpegConsumer to save).

Other settings are equal now.

Will make more tests.

It works in WDR mode too.

But my code does not.

Here is my code, where I removed everything except the parts needed to save a WDR JPEG:

// sys
#include <iostream>
#include <fstream>
#include <chrono>
#include <math.h>
#include <omp.h>

// utils
#include <Error.h>
#include <Thread.h>

// include
#include <Argus/Argus.h>
#include <Argus/Settings.h>
#include <ArgusHelpers.h>
#include <Argus/Ext/BayerSharpnessMap.h>
#include <Argus/Ext/DeFog.h>
#include <Argus/Ext/InternalFrameCount.h>
#include <Argus/Ext/SensorPrivateMetadata.h>
#include <Argus/Ext/PwlWdrSensorMode.h>
#include <Argus/Ext/DolWdrSensorMode.h>
#include <EGLStream/EGLStream.h>
#include <EGLStream/NV/ImageNativeBuffer.h>
#include <NvVideoEncoder.h>
#include <NvApplicationProfiler.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>
#include <NvEglRenderer.h>
#include <NvJpegEncoder.h>
#include <Util.h>
#include <Value.h>

// cuda samples
#include <Exceptions.h>

// hc
#include "utils.hpp"

class ConsumerThread : public ArgusSamples::Thread
{
public:
    explicit ConsumerThread( Argus::OutputStream * streamLeft, Argus::OutputStream * streamRight );
    ~ConsumerThread();

    bool isInError()
    {
        return m_gotError;
    }

private:
    /** @name Thread methods */
    /**@{*/
    virtual bool threadInitialize();
    virtual bool threadExecute();
    virtual bool threadShutdown();
    /**@}*/

    void abort();

    Argus::OutputStream * m_streamLeft, * m_streamRight;
    Argus::UniqueObj< EGLStream::FrameConsumer > m_consumerLeft, m_consumerRight;

    bool m_gotError{};

public:

};

ConsumerThread::ConsumerThread( Argus::OutputStream * streamLeft, Argus::OutputStream * streamRight ) :
        m_streamLeft( streamLeft ),
        m_streamRight( streamRight )
{

}

ConsumerThread::~ConsumerThread()
{

}

bool ConsumerThread::threadInitialize()
{

    // Create the FrameConsumer.
    m_consumerLeft = Argus::UniqueObj< EGLStream::FrameConsumer >( EGLStream::FrameConsumer::create( m_streamLeft ) );
    if( !m_consumerLeft ) ORIGINATE_ERROR( "Failed to create left FrameConsumer" );

    m_consumerRight = Argus::UniqueObj< EGLStream::FrameConsumer >( EGLStream::FrameConsumer::create( m_streamRight ) );
    if( !m_consumerRight ) ORIGINATE_ERROR( "Failed to create right FrameConsumer" );

    return true;
}

bool ConsumerThread::threadExecute()
{
    Argus::IEGLOutputStream * iStreamLeft{ Argus::interface_cast< Argus::IEGLOutputStream >( m_streamLeft ) };
    EGLStream::IFrameConsumer * iFrameConsumerLeft{ Argus::interface_cast< EGLStream::IFrameConsumer >( m_consumerLeft ) };
    Argus::IEGLOutputStream * iStreamRight{ Argus::interface_cast< Argus::IEGLOutputStream >( m_streamRight ) };
    EGLStream::IFrameConsumer * iFrameConsumerRight{ Argus::interface_cast< EGLStream::IFrameConsumer >( m_consumerRight ) };

    // Wait until the producer has connected to the stream.
    CONSUMER_PRINT( "Waiting until producer is connected...\n" );
    if( iStreamLeft->waitUntilConnected() != Argus::STATUS_OK ) ORIGINATE_ERROR( "Stream left failed to connect." );
    if( iStreamRight->waitUntilConnected() != Argus::STATUS_OK ) ORIGINATE_ERROR( "Stream right failed to connect." );

    Argus::UniqueObj<EGLStream::Frame> frame(iFrameConsumerLeft->acquireFrame());
    EGLStream::IFrame *iFrame2 = Argus::interface_cast<EGLStream::IFrame>(frame);
    if( iFrame2 )
    {
        // Get the Frame's Image.
        EGLStream::Image *image = iFrame2->getImage();
        EGLStream::IImageJPEG *iJPEG = Argus::interface_cast<EGLStream::IImageJPEG>(image);
        if (iJPEG)
        {
            // Write the Image to disk as JPEG.
            const std::chrono::system_clock::time_point now{ std::chrono::system_clock::now() };
            const std::string path{ recordPath + "/" + time_point_to_string( now ) + "_x.jpg" };
            iJPEG->writeJPEG( path.c_str() );
        } else std::cout << "Bad Jpg!\n";
    } else std::cout << "Bad IFrame!\n";

    STOP = true;

    requestShutdown();

    CONSUMER_PRINT( "Done.\n" );

    return true;
}

bool ConsumerThread::threadShutdown()
{
    return true;
}

void ConsumerThread::abort()
{
    
    m_gotError = true;
}

void sigint( const int32_t signum )
{
    if( signum >= 0 ) std::cout << " Stop recording ctrl-c. " << signum << std::endl;
    STOP = true;
}

bool execute()
{
    eglDisplay = eglGetDisplay( EGL_DEFAULT_DISPLAY );
    if( eglDisplay == EGL_NO_DISPLAY )
    {
        printf( "Cannot get EGL display.\n" );
        return false; // execute() returns bool, not an exit code
    }

    // Initialize the Argus camera provider.
    Argus::UniqueObj< Argus::CameraProvider > cameraProvider{ Argus::CameraProvider::create() };
    Argus::ICameraProvider * iCameraProvider{ Argus::interface_cast< Argus::ICameraProvider >( cameraProvider ) };
    if( !iCameraProvider ) ORIGINATE_ERROR( "Failed to create CameraProvider!" );
    PRODUCER_PRINT( "Argus Version: %s\n", iCameraProvider->getVersion().c_str() );

    // Get the camera devices.
    std::vector< Argus::CameraDevice * > cameraDevices;
    iCameraProvider->getCameraDevices( & cameraDevices );
    if( cameraDevices.size() < 2 ) ORIGINATE_ERROR( "< 2 cameras available!" );
    std::vector < Argus::CameraDevice * > lrCameras;
    lrCameras.push_back( cameraDevices[ 0 ] );
    lrCameras.push_back( cameraDevices[ 1 ] );    

    // Get sensor modes
    Argus::CameraDevice * cameraDevice{ ArgusSamples::ArgusHelpers::getCameraDevice( cameraProvider.get(), 0 ) };
    if( !cameraDevice ) ORIGINATE_ERROR( "Selected camera device is not available!" );
    ArgusSamples::ArgusHelpers::printCameraDeviceInfo( cameraDevice, nullptr );

    Argus::SensorMode * sensorMode{ ArgusSamples::ArgusHelpers::getWdrSensorMode( cameraDevice ) };
//    Argus::SensorMode * sensorMode{ ArgusSamples::ArgusHelpers::getSensorMode( cameraDevice, 0 ) };
    Argus::ISensorMode * iSensorMode{ Argus::interface_cast< Argus::ISensorMode >( sensorMode ) };
    if( !iSensorMode ) ORIGINATE_ERROR( "Failed to get sensor mode interface!" );

    // Create the capture sessions
    Argus::UniqueObj< Argus::CaptureSession > captureSession{ iCameraProvider->createCaptureSession( lrCameras ) };
    Argus::ICaptureSession * iCaptureSession{ Argus::interface_cast< Argus::ICaptureSession >( captureSession ) };
    if( !iCaptureSession ) ORIGINATE_ERROR( "Failed to get ICaptureSession interface!" );

    // Create the OutputStreamSettings settings.
    PRODUCER_PRINT( "Creating output stream\n" );
    Argus::UniqueObj< Argus::OutputStreamSettings > outputStreamSettings{ iCaptureSession->createOutputStreamSettings( Argus::STREAM_TYPE_EGL ) };    
    Argus::IOutputStreamSettings * iOutputStreamSettings{ Argus::interface_cast< Argus::IOutputStreamSettings >( outputStreamSettings ) };
    if( !iOutputStreamSettings ) ORIGINATE_ERROR( "Failed to create iOutputStreamSettings!" );

    // Create the IEGLOutputStreamSettings settings.
    Argus::IEGLOutputStreamSettings * iEGLOutputStreamSettings{ Argus::interface_cast< Argus::IEGLOutputStreamSettings >( outputStreamSettings ) };
    if( !iEGLOutputStreamSettings ) ORIGINATE_ERROR( "Failed to create iEGLOutputStreamSettings!" );
    iEGLOutputStreamSettings->setPixelFormat( Argus::PIXEL_FMT_YCbCr_420_888 );
//    iEGLOutputStreamSettings->setEGLDisplay( eglDisplay );
    iEGLOutputStreamSettings->setResolution( iSensorMode->getResolution() );
    iEGLOutputStreamSettings->setMetadataEnable( true );
//    iEGLOutputStreamSettings->setExposureCount( 2 );
//    std::cout << "getExposureCount: " << iEGLOutputStreamSettings->getExposureCount() << "\n";
    std::cout << "getResolution: " << iSensorMode->getResolution().width() << "x" << iSensorMode->getResolution().height() << "\n";

    // Create egl streams
    iOutputStreamSettings->setCameraDevice( ArgusSamples::ArgusHelpers::getCameraDevice( cameraProvider.get(), 0 ) );
    Argus::UniqueObj< Argus::OutputStream > streamLeft{ iCaptureSession->createOutputStream( outputStreamSettings.get() ) };
    iOutputStreamSettings->setCameraDevice( ArgusSamples::ArgusHelpers::getCameraDevice( cameraProvider.get(), 1 ) );
    Argus::UniqueObj< Argus::OutputStream > streamRight{ iCaptureSession->createOutputStream( outputStreamSettings.get() ) };

    // Launch the FrameConsumer thread to consume frames from the OutputStream.
    PRODUCER_PRINT( "Launching consumer thread\n" );
    ConsumerThread frameConsumerThread{ streamLeft.get(), streamRight.get() };

    PROPAGATE_ERROR( frameConsumerThread.initialize() );
    // Wait until the consumer is connected to the stream.
    PROPAGATE_ERROR( frameConsumerThread.waitRunning() );

    // Create capture request.
    Argus::UniqueObj< Argus::Request > request{ iCaptureSession->createRequest( Argus::CAPTURE_INTENT_VIDEO_RECORD ) };
    Argus::IRequest * iRequest{ Argus::interface_cast< Argus::IRequest >( request ) };

    //  Enable output streams
    if( !iRequest ) ORIGINATE_ERROR( "Failed to create Request!" );
    if( iRequest->enableOutputStream( streamLeft.get() ) != Argus::STATUS_OK ) ORIGINATE_ERROR( "Failed to enable left stream in Request!" );
    if( iRequest->enableOutputStream( streamRight.get() ) != Argus::STATUS_OK ) ORIGINATE_ERROR( "Failed to enable right stream in Request!" );

    // Get frame rate from the sensor mode's frame duration range (ns)
    const uint32_t maxFramerate{ uint32_t( 1e9 / ( iSensorMode->getFrameDurationRange().min() - 1 ) ) };
    const uint32_t minFramerate{ uint32_t( 1e9 / iSensorMode->getFrameDurationRange().max() ) + 1 };
    std::cout << "Fps min: " << minFramerate << ", max: " << maxFramerate << "\n";

    // ISourceSettings
    Argus::ISourceSettings * iSourceSettings{ Argus::interface_cast< Argus::ISourceSettings >( iRequest->getSourceSettings() ) };
    if( !iSourceSettings ) ORIGINATE_ERROR( "Failed to get source settings request interface!" );
    iSourceSettings->setSensorMode( sensorMode );

    iSourceSettings->setFrameDurationRange( Argus::Range< uint64_t >{ 1, uint64_t( 1e9 / maxFramerate ) } );
    iSourceSettings->setExposureTimeRange( Argus::Range< uint64_t >{ 864000, 20480000 } ); // WDR sensor mode 2 range
    Argus::Range< float > gainRange{ 1.f, iSensorMode->getAnalogGainRange().max() }; // clamp to the mode's analog gain range
    iSourceSettings->setGainRange( gainRange );
//    iSourceSettings->setOpticalBlackEnable( true );

    // Argus is now all setup and ready to capture
    // Submit capture requests.
    PRODUCER_PRINT( "Starting repeat capture requests.\n" );
    if( iCaptureSession->repeat( request.get() ) != Argus::STATUS_OK ) ORIGINATE_ERROR( "Failed to start repeat capture request!" );

    // Wait for CAPTURE_TIME seconds.
    for( int32_t i{}; i < CAPTURE_TIME && ! frameConsumerThread.isInError(); i ++ )
    {
        if( STOP ) break;
        sleep( 1 ); // seconds
//        std::cout << STOP << "...\n";
    }

    // Stop the repeating request and wait for idle.
    iCaptureSession->stopRepeat();
    iCaptureSession->waitForIdle();

    // Destroy the output stream to end the consumer thread.
    streamLeft.reset();
    streamRight.reset();

    // Wait for the consumer thread to complete.
    PROPAGATE_ERROR( frameConsumerThread.shutdown() );

    // Shut down Argus.
    cameraProvider.reset();

//    eglTerminate( m_eglDisplay );
    eglTerminate( eglDisplay );
    std::cout << "Executed.\n";

    return true;
}

int main()
{
    execute();

    std::cout << "exit...\n";
    std::cout.flush();

    return 0;
}

It is very similar to all the Argus samples. But the frame is corrupted.
<b>Maybe I should set the sensor/stream settings in a special order?</b>
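
As a sanity check on the fps values the program prints ("Fps min: 2, max: 30"), the conversion from the frame-duration range (nanoseconds) to fps can be reproduced stand-alone, using the same rounding as the listing above; plain C++ with no Argus dependency:

```cpp
#include <cstdint>

// Convert a frame-duration range (nanoseconds, as returned by
// ISensorMode::getFrameDurationRange()) into integer fps limits.
uint32_t maxFpsFromMinDuration( uint64_t minDurationNs )
{
    // subtract 1 ns so that e.g. 33333334 ns maps to 30 fps, not 29
    return uint32_t( 1e9 / double( minDurationNs - 1 ) );
}

uint32_t minFpsFromMaxDuration( uint64_t maxDurationNs )
{
    // round up so the minimum fps is always achievable
    return uint32_t( 1e9 / double( maxDurationNs ) ) + 1;
}
```

With mode 2's FrameDurationRange [33333334, 666667072] this yields max 30 and min 2, matching the log.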

hello vsw,

please debug your application and find the root cause.
thanks

Now I am closer to the cause of the problem.

I made more tests, and if I use only ONE camera then the WDR image is OK.
If the captureSession is created with a vector of devices (>1), then the images are damaged.

Please help.

Maybe the problem is also here:

// Create egl streams
iOutputStreamSettings->setCameraDevice( ArgusSamples::ArgusHelpers::getCameraDevice( cameraProvider.get(), 0 ) );
Argus::UniqueObj< Argus::OutputStream > streamLeft{ iCaptureSession->createOutputStream( outputStreamSettings.get() ) };
iOutputStreamSettings->setCameraDevice( ArgusSamples::ArgusHelpers::getCameraDevice( cameraProvider.get(), 1 ) );
Argus::UniqueObj< Argus::OutputStream > streamRight{ iCaptureSession->createOutputStream( outputStreamSettings.get() ) };

When I use 1 camera, setCameraDevice is not necessary.

Updated.

And it is not only my code's problem.
If I use multiSession mode in argus_camera and select DOL, then the app freezes.

If I set deviceCount (in multiSession.cpp) to the actual number of cameras, then argus_camera works in multiSession DOL WDR mode.
The reason: my board has 3 camera ports (the driver registers 3 devices anyway), but I am using only 2 cameras.

But my code is still wrong.
It is something around setting the parameters for every sensor…

In argus_camera multiSession, the app creates a session for every camera. But in my approach I create a single session for 2 cameras, because I need to process the 2 frames together. That is the first difference.
How can I use a single session for the EGL streams of 2 cameras and set equal sensor settings?

Should I use a single session and a request per sensor?

But:

A capture session is bound to one or more sensors, and each sensor can be bound to only one capture
session.

Maybe it is related to this; the 0.97 Argus PDF says:
Support for multiple sensors, including both separate control over independent sensors and
access to synchronized multi-sensor configurations. (The latter is unsupported in the current
release.)

I have tried everything per sensor (including an independent session and request), and WDR mode works for multiple sensors.

Why do a single session and a single request work fine in the base sensor mode, but produce corrupted WDR images in DOL mode?

And what is the difference between these approaches?

hello vsw,

glad to know you have made some progress, please refer to the descriptions of these approaches below.

single-session:
usually used for simultaneous capture or multi-frame synchronization use cases; there is only one set of software capture events, and all the sensors share the same configuration.

multi-session:
this creates a separate capture session for each sensor; each sensor has its own configuration.

Thanks.
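
For readers following along, the multi-session approach described above might look roughly like this (a sketch only, not compiled here; it assumes iCameraProvider, lrCameras, sensorMode and the error macros from the listing earlier in the thread, and the objects would need to outlive the loop in real code):

```cpp
// Sketch: one CaptureSession/Request per sensor, so each camera runs
// its own AE, matching the multi-session description above.
for( size_t i{}; i < lrCameras.size(); ++ i )
{
    Argus::UniqueObj< Argus::CaptureSession > session{
        iCameraProvider->createCaptureSession( lrCameras[ i ] ) };
    Argus::ICaptureSession * iSession{
        Argus::interface_cast< Argus::ICaptureSession >( session ) };
    if( !iSession ) ORIGINATE_ERROR( "Failed to create per-sensor session!" );

    // No setCameraDevice needed: each session owns exactly one device.
    Argus::UniqueObj< Argus::OutputStreamSettings > settings{
        iSession->createOutputStreamSettings( Argus::STREAM_TYPE_EGL ) };

    Argus::UniqueObj< Argus::Request > request{
        iSession->createRequest( Argus::CAPTURE_INTENT_VIDEO_RECORD ) };
    Argus::IRequest * iReq{ Argus::interface_cast< Argus::IRequest >( request ) };
    Argus::ISourceSettings * iSrc{
        Argus::interface_cast< Argus::ISourceSettings >( iReq->getSourceSettings() ) };
    if( iSrc ) iSrc->setSensorMode( sensorMode ); // same DOL mode on every sensor

    // ... create the per-session output stream and submit repeat() as before
}
```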

So why does a single session work in the base mode but not in the DOL mode?

hello vsw,

since the DOL capture result depends on the fusion of different exposure times, an AE algorithm running individually for each camera sensor is necessary.
you might also attach the corrupted WDR results in DOL mode for us to check. thanks

Here they are:

From argus_camera, WDR DOL mode (single sensor), fine:

From my app, WDR DOL mode (single session, multi sensor), corrupted: