Reopen: Volatile exposure time / frame time

Hi,

This is a continuation of my previous issue:

The previous setup was JetPack 4.3, Argus lib v97, and 2x IMX577 sensors from Leopard Imaging.
I have now installed the new JetPack 4.6 (L4T 32.6.1) with Argus lib v98 and the latest drivers, on the same hardware setup.

But the issue is still present.
Two cases:

  1. Request N frames, then acquire them from another thread.
  2. Request a frame and immediately acquire it.

In the 1st case I get floating/volatile frame exposure:


These are many 8px-wide frames stacked together.

In the 2nd case I get stable exposure but volatile frame time, with about 0-10 ms difference between every 2 frames. The camera adapter has FPGA sync for all the sensors.


These are also many frames stacked together, but 58px wide, because in the 2nd case each request spends 4-5 frames (~200 ms) before the capture starts.
We can see non-square patterns in the horizontal direction, but a clear and solid exposure.

Please help me understand this.

How can I make the exposure time constant in batch-request capture mode?

Please help.

Please refer to /usr/src/jetson_multimedia_api/argus/samples/userAutoExposure.

BTW, the AE algorithm needs a few (3-5) frames to apply a new setting.
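
If you change settings at runtime, one option is to drop the first few frames after the change so the pipeline has settled before you use the output. A minimal sketch (kSettleFrames and iFrameConsumer are placeholder names):

// Sketch: discard the first few frames after changing capture settings,
// since the AE/ISP pipeline needs several frames to converge.
const size_t kSettleFrames{ 5 };
for( size_t i{}; i < kSettleFrames; ++ i )
{
    Argus::UniqueObj< EGLStream::Frame > frame{
        iFrameConsumer->acquireFrame( Argus::TIMEOUT_INFINITE ) };
    // The frame is released here without being processed.
}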

I need constant exposure in batch-request mode. Is that possible?

Please provide the settings for a constant-exposure mode.

Do you mean a different exposure setting for each request in a batch? Sorry, the current design doesn’t support that.

No, I mean a fixed exposure time. How do I do that?
What settings should I set in C++ code?

Also, maybe the issue depends on WB.
How do I set a fixed WB in C++ code?

Maybe try Argus::IAutoControlSettings::setAeLock(true) and Argus::IAutoControlSettings::setAwbMode(AWB_MODE_OFF).

https://docs.nvidia.com/jetson/archives/l4t-multimedia-archived/l4t-multimedia-3261/classArgus_1_1IAutoControlSettings.html#a37d259ea62879c41c89e993e137e487d
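
For example (a minimal sketch, assuming iRequest is a valid Argus::IRequest *):

Argus::IAutoControlSettings * iAutoControlSettings{
    Argus::interface_cast< Argus::IAutoControlSettings >( iRequest->getAutoControlSettings() ) };
if( iAutoControlSettings )
{
    iAutoControlSettings->setAeLock( true );                 // freeze auto exposure
    iAutoControlSettings->setAwbMode( Argus::AWB_MODE_OFF ); // disable auto white balance
}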

And what about the setAwbLock() call?

I have tried that many times, with all combinations.

In single-capture mode the exposure is stable, but with batch-request capture the exposure differs for every frame. And the code is the same.

For the AE, I would suggest setting Argus::ISourceSettings::setExposureTimeRange() and narrowing the range to a single value.

https://docs.nvidia.com/jetson/archives/l4t-multimedia-archived/l4t-multimedia-3261/classArgus_1_1ISourceSettings.html#a97259077b42ecfb13d139da5ede8c2a1
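
Something like this (a sketch; exposure values are in nanoseconds and fixedExposureNs is a placeholder):

Argus::ISourceSettings * iSourceSettings{
    Argus::interface_cast< Argus::ISourceSettings >( iRequest->getSourceSettings() ) };
if( iSourceSettings )
{
    const uint64_t fixedExposureNs{ 2500000 }; // 2.5e6 ns = 2.5 ms
    // A degenerate range (min == max) pins the exposure to a single value.
    iSourceSettings->setExposureTimeRange( Argus::Range< uint64_t >{ fixedExposureNs } );
    // Fix the analog gain range the same way so AE cannot compensate via gain.
    iSourceSettings->setGainRange( Argus::Range< float >{ 1.f } );
}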

Here is my code:

bool Camera::configureSensor( Argus::IRequest * iRequest )
{
    // Frame-rate limits derived from the sensor mode's frame duration range (ns)
    uint64_t maxFramerate{ uint64_t( 1e9 / ( m_iSensorMode->getFrameDurationRange().min() - 1 ) ) };
    uint64_t minFramerate{ uint64_t( 1e9 / m_iSensorMode->getFrameDurationRange().max() ) + 1 };

    // ISourceSettings
    Argus::ISourceSettings * iSourceSettings{ Argus::interface_cast< Argus::ISourceSettings >( iRequest->getSourceSettings() ) };
    if( not iSourceSettings ) ORIGINATE_ERROR( "Failed to get source settings request interface!\n" );
//    iSourceSettings->setSensorMode( m_sensorMode );

//    Argus::IStreamSettings * streamSettings{ Argus::interface_cast< Argus::IStreamSettings>( iRequest->getStreamSettings( m_streams.at( 0 ).get() ) ) };
//    streamSettings->setPostProcessingEnable( false );
//    streamSettings->setSourceClipRect( { 0.25f, 0.25f, 0.75f, 0.75f } );

    // IDeFogSettings
    Argus::Ext::IDeFogSettings * iDeFogSettings{ Argus::interface_cast< Argus::Ext::IDeFogSettings >( iRequest->getSourceSettings() ) };
    if( not iDeFogSettings ) ORIGINATE_ERROR( "Failed to get IDeFogSettings interface!\n" );
    iDeFogSettings->setDeFogEnable( false );
    iDeFogSettings->setDeFogAmount( 1.f );
    iDeFogSettings->setDeFogQuality( 1.f );

    // IDenoiseSettings
    Argus::IDenoiseSettings * iDenoiseSettings{ Argus::interface_cast< Argus::IDenoiseSettings >( iRequest->getSourceSettings() ) };
    if( not iDenoiseSettings ) ORIGINATE_ERROR( "Failed to get IDenoiseSettings interface!\n" );
    iDenoiseSettings->setDenoiseStrength( 1.f );
    iDenoiseSettings->setDenoiseMode( Argus::DENOISE_MODE_FAST );

    // IEdgeEnhanceSettings
    Argus::IEdgeEnhanceSettings * iEdgeEnhanceSettings{ Argus::interface_cast< Argus::IEdgeEnhanceSettings >( iRequest->getSourceSettings() ) };
    if( not iEdgeEnhanceSettings ) ORIGINATE_ERROR( "Failed to get IEdgeEnhanceSettings interface!\n" );
    iEdgeEnhanceSettings->setEdgeEnhanceStrength( 1.f );
    iEdgeEnhanceSettings->setEdgeEnhanceMode( Argus::EDGE_ENHANCE_MODE_OFF );

    // IAutoControlSettings
    Argus::IAutoControlSettings * iAutoControlSettings{ Argus::interface_cast< Argus::IAutoControlSettings >( iRequest->getAutoControlSettings() ) };
    if( not iAutoControlSettings ) ORIGINATE_ERROR( "Failed to get IAutoControlSettings interface!\n" );

    iAutoControlSettings->setAeAntibandingMode( Argus::AE_ANTIBANDING_MODE_OFF );

    iSourceSettings->setFrameDurationRange( { m_fpsTime } );
    iSourceSettings->setGainRange( { m_analogGain } );
    //    iSourceSettings->setOpticalBlackEnable( true );
    iSourceSettings->setExposureTimeRange( { m_exposureTime } );
    iAutoControlSettings->setAeLock( true );

    iAutoControlSettings->setAwbMode( m_awbAuto ? Argus::AWB_MODE_AUTO : Argus::AWB_MODE_MANUAL );
    Argus::BayerTuple< float > wb;
    wb[ 0 ] = float( m_wbGains.at( 0 ) );
    wb[ 1 ] = float( m_wbGains.at( 1 ) );
    wb[ 2 ] = float( m_wbGains.at( 2 ) );
    wb[ 3 ] = float( m_wbGains.at( 3 ) );
    if( not m_awbAuto )
    {
        iAutoControlSettings->setWbGains( wb );
    }
    iAutoControlSettings->setAwbLock( false );
//    iAutoControlSettings->setAwbLock( not m_awbAuto );


    iAutoControlSettings->setIspDigitalGainRange( { m_digitalGain } );
    Argus::Range< float > ispGainRange{ iAutoControlSettings->getIspDigitalGainRange() };

    iAutoControlSettings->setColorCorrectionMatrixEnable( m_useCCM );
    iAutoControlSettings->setColorCorrectionMatrix( m_ccm );

    
    return true;
}

This code is the same for both the batch and the single request.

But the exposure time differs:

The top is the single request; the bottom is the batch.

I don’t understand what this picture is supposed to show.
Also, what do you mean by batch and single request?

Explanation:

  1. Batch capture: request N frames and process them.
bool Camera::batchCapture()
{
    m_abortBatchCapture = false;
    std::thread t{ & Camera::dispatch, this };

    Argus::Status status{};
    bool submitFailed{ false };
    for( size_t i{}; i < m_captureEveryFrameN * m_scanShiftNumber * m_camerasNum; ++ i )
    {
        const auto result{ m_iCaptureSession->capture( m_request.get(), Argus::TIMEOUT_INFINITE, & status ) };
        if( result == 0 )
        {
            // Returning here while `t` is still joinable would call std::terminate();
            // stop the dispatch thread first and report the error after join().
            m_abortBatchCapture = true;
            submitFailed = true;
            break;
        }
        if( m_abortBatchCapture ) break;
    }

    t.join();
    if( submitFailed ) ORIGINATE_ERROR( "Failed to submit capture request (status %x)", status );
    return true;
}

void Camera::dispatch()
{
    size_t frameCounter{};
    while( true )
    {
        // Keep the acquired frames alive until processFrames() returns:
        // each Image pointer is owned by its Frame, so releasing the Frame
        // early would leave a dangling pointer in `images`.
        std::vector< Argus::UniqueObj< EGLStream::Frame > > frames( m_camerasNum );
        std::vector< EGLStream::Image * > images( m_camerasNum );
        for( size_t cameraId{}; cameraId < m_camerasNum; ++ cameraId )
        {
            Argus::Status status{};
            frames.at( cameraId ).reset( m_iFrameConsumers.at( cameraId )->acquireFrame( Argus::TIMEOUT_INFINITE, & status ) );

            EGLStream::IFrame * iFrame{ Argus::interface_cast< EGLStream::IFrame >( frames.at( cameraId ) ) };

            images.at( cameraId ) = iFrame->getImage();
        }

        processFrames( images, m_captureCropYLeft, m_captureCropYRight, frameCounter );
        frameCounter ++;

        if( frameCounter >= m_scanShiftNumber ) break;
        if( m_abortBatchCapture ) break;
    }
}
  2. Single capture: request 1 frame and process it.
bool Camera::capture()
{
    Argus::Status status{};
    const auto result{ m_iCaptureSession->capture( m_request.get(), Argus::TIMEOUT_INFINITE, & status ) };
    if( result == 0 )
        ORIGINATE_ERROR( "Failed to submit capture request (status %x)", status );

    // Same frame-lifetime handling as in dispatch().
    std::vector< Argus::UniqueObj< EGLStream::Frame > > frames( m_camerasNum );
    std::vector< EGLStream::Image * > images( m_camerasNum );
    for( size_t cameraId{}; cameraId < m_camerasNum; ++ cameraId )
    {
        Argus::Status status{};
        frames.at( cameraId ).reset( m_iFrameConsumers.at( cameraId )->acquireFrame( Argus::TIMEOUT_INFINITE, & status ) );
        EGLStream::IFrame * iFrame{ Argus::interface_cast< EGLStream::IFrame >( frames.at( cameraId ) ) };
        images.at( cameraId ) = iFrame->getImage();
    }

    size_t counter{}; // in the full code this counter lives outside capture()
    processFrames( images, m_captureCropYLeft, m_captureCropYRight, counter );
    return true;
}

The green picture shows the difference between these 2 cases; the frames are stitched together.

Top side: capture N frames using the case 2 code.
Bottom side: request N frames and dispatch them using the case 1 code.

The top side has stable exposure, as we can see (for every request the ISP takes 3-5 frames and the latest frame is the result, which takes ~200 ms). There is a shadow from the lights, but it is not floating exposure.
The bottom side has floating exposure (all requested frames arrive at the ~33 ms frame interval).

The configureSensor() code is the same for both cases.

So the “single capture” case is a hack to work around the exposure bug, since 200 ms per frame was OK for our task.
But now we need batch capture at the real FPS.
It is an old bug that exists on all Jetson platforms (in Argus or in the Leopard Imaging drivers/hardware), or my misunderstanding, or a Jetson hardware limitation.

It is not a lighting or power-supply issue. We ran many tests with different power sources and all types of lights.

To reproduce it, take a light-gray background and any LED lights, and capture N frames from a 50 cm distance inside a closed box. Our target exposure is 10e6 ns.

Hi, please help.

I suppose that if you set the exposure range to a fixed value, you shouldn’t have the volatility problem.

OK. I will test today.

Hi, I ran the tests.
This is about 300 frames. Every frame is cropped to 24px from the center, captured from a fixed position at 90 cm distance in a closed box with LED lights. 10 fps, 2.5e6 ns exposure time, AWB lock, AE lock, gains 1.
As you can see, the exposure randomly differs between frames.
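
To narrow down where the variation comes from, it may help to read back the per-frame capture metadata and compare it against the requested values: if getSensorExposureTime() is constant while the images still vary, the change happens after the sensor (in ISP digital gain, for example). A sketch, assuming frames come from an EGLStream consumer as in the code above (iFrameConsumer stands for one of m_iFrameConsumers):

Argus::UniqueObj< EGLStream::Frame > frame{
    iFrameConsumer->acquireFrame( Argus::TIMEOUT_INFINITE ) };
EGLStream::IArgusCaptureMetadata * iArgusMetadata{
    Argus::interface_cast< EGLStream::IArgusCaptureMetadata >( frame ) };
if( iArgusMetadata )
{
    Argus::ICaptureMetadata * iMetadata{
        Argus::interface_cast< Argus::ICaptureMetadata >( iArgusMetadata->getMetadata() ) };
    if( iMetadata )
    {
        printf( "exposure %llu ns, analog gain %f, ISP digital gain %f\n",
                (unsigned long long)iMetadata->getSensorExposureTime(),
                iMetadata->getSensorAnalogGain(),
                iMetadata->getIspDigitalGain() );
    }
}

Note that metadata must be enabled on the output stream (Argus::IEGLOutputStreamSettings::setMetadataEnable( true )) for getMetadata() to return anything.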