Argus manual exposure issue

Hello,

I have a problem with libargus when I try to set a manual exposure time:
I have a sensor whose default frame rate is 60fps.
I set it to 30fps.
I set the exposure to 30ms (this is a valid value at 30fps).
I capture an image and read the metadata: libargus tells me that the exposure is 30ms, but the embedded metadata of the image shows an actual exposure time of 16ms, which is the maximum value at 60fps.
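For reference, the numbers are consistent with the exposure time being clamped to the frame duration: at 60fps the frame duration is 1/60 s ≈ 16.7ms, so the maximum exposure is about 16ms, while at 30fps it is 1/30 s ≈ 33.3ms, which leaves room for a 30ms exposure.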

When I put some debug messages in the sensor driver, I see the following behavior:

  • frame rate is set at 60fps: this is the default value
  • frame rate is set at 30fps: this is the requested value
  • exposure time is set at 30ms: this is the requested value
  • frame rate is set at 30fps: I don’t know where this call comes from
  • frame rate is set at 60fps: I don’t know where this call comes from, but it does change the exposure time to 16ms (the max value at 60fps)
  • frame rate is set back to 30fps, but the exposure time stays at 16ms

Do you have any explanation for these frame rate changes?
I tried changing the frame rate and the exposure with v4l2-ctl, and I don’t have this problem there.

Regards,
Ben

Did you implement your own app using the API?
Have you verified with the sample app argus_camera?

I have written a small program based on argus samples in which I set the frame rate and the exposure:

// Set frame rate to 30fps:
uint64_t userFramerate = 30;
uint64_t userFrameDuration = 1000000000 / userFramerate;
printf("Framerate = %ju fps (Frame Duration = %ju)\n", userFramerate, userFrameDuration);
status = iSourceSettings->setFrameDurationRange(Range<uint64_t>(userFrameDuration));
if (status != STATUS_OK)
    ORIGINATE_ERROR("Unable to set the new framerate");

// Set the exposure time
uint64_t userExposureTime = 30000000;
printf("Exposure = %ju\n", userExposureTime);
status = iSourceSettings->setExposureTimeRange(Range<uint64_t>(userExposureTime));
if (status != STATUS_OK)
    ORIGINATE_ERROR("Unable to set the new exposure time");

After these commands I don’t call setFrameDurationRange or setExposureTimeRange again, but the values are changed anyway, as I explained in my original post.

It’s difficult to verify with argus_camera because there are many automatic controls being applied.

Could you share your source and binary here?

Hello,
Here is my source code:

/*
 * Copyright (c) 2016 - 2018, NVIDIA CORPORATION. All rights reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
 * are met:
 *  * Redistributions of source code must retain the above copyright
 *    notice, this list of conditions and the following disclaimer.
 *  * Redistributions in binary form must reproduce the above copyright
 *    notice, this list of conditions and the following disclaimer in the
 *    documentation and/or other materials provided with the distribution.
 *  * Neither the name of NVIDIA CORPORATION nor the names of its
 *    contributors may be used to endorse or promote products derived
 *    from this software without specific prior written permission.
 *
 * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY
 * EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
 * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
 * PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL THE COPYRIGHT OWNER OR
 * CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
 * EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
 * PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
 * PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
 * OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
 * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
 * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
 */

#include <Argus/Argus.h>
#include <Argus/Ext/BayerAverageMap.h>
#include <Argus/Ext/NonLinearHistogram.h>
#include <Argus/Ext/SensorPrivateMetadata.h>

#include "ArgusHelpers.h"
#include "CommonOptions.h"

#include <iomanip>
#include <libgen.h> // for basename() used in main()

#include "EGLGlobal.h"
#include <EGLStream/EGLStream.h>
#include "GLContext.h"
#include "Error.h"
#include "Thread.h"
#include "Window.h"

namespace ArgusSamples
{

using namespace Argus;
using namespace EGLStream;

// Globals
EGLDisplayHolder g_display;

// Debug print macros
#define PRODUCER_PRINT(...)         printf("PRODUCER: " __VA_ARGS__)
#define PREVIEW_CONSUMER_PRINT(...) printf("PREVIEW CONSUMER: " __VA_ARGS__)

// Convert a pixel number into an embedded-data buffer index.
// This is valid for the first line only; for the second line, an offset of (2 * line_length) should be added to pixel.
uint16_t embeddedDataIndex(uint16_t pixel) {
    // Dummy bytes are inserted depending on the RAW format, see Software Reference Manual figures 4-11 to 4-13 on page 57
    // RAW10 format:
    return pixel - 1 + ((pixel - 1) / 4);
}
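// Worked example (RAW10): pixel 7 maps to buffer index 6 + 6/4 = 7, and pixel 39
// maps to 38 + 38/4 = 47; one dummy byte is skipped after every four data bytes.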

/*******************************************************************************
 * Extended options class to add additional options specific to this sample.
 ******************************************************************************/
class MyAppOptions : public CommonOptions {
public:
    MyAppOptions(const char *programName)
        : CommonOptions(programName,
                        ArgusSamples::CommonOptions::Option_R_WindowRect |
                        ArgusSamples::CommonOptions::Option_F_FrameCount)
    {
    }

protected:

};

/*******************************************************************************
 * Argus Consumer thread:
 *   Opens an on-screen GL window and renders a live camera preview
 ******************************************************************************/
class PreviewConsumerThread : public Thread {
public:
    explicit PreviewConsumerThread(EGLDisplay display, EGLStreamKHR stream)
        : m_display(display)
        , m_stream(stream)
        , m_texture(0)
        , m_program(0)
        , m_textureUniform(-1) {
    }
    ~PreviewConsumerThread() {
    }

private:
    /** @name Thread methods */
    /**@{*/
    virtual bool threadInitialize();
    virtual bool threadExecute();
    virtual bool threadShutdown();
    /**@}*/

    EGLDisplay m_display;
    GLContext m_context;
    EGLStreamKHR m_stream;
    GLuint m_texture;
    GLuint m_program;
    GLint m_textureUniform;
    Size2D<uint32_t> m_windowSize;
};

bool PreviewConsumerThread::threadInitialize() {
    // The Window class is defined in utils/gtk/Window.h if WINDOW_GUI_SUPPORT = WINDOW_GUI_GTK
    // Window is derived from the WindowBase defined in utils/WindowBase.h
    Window &window = Window::getInstance();

    // Create the context and make it current.
    PREVIEW_CONSUMER_PRINT("Creating OpenGL context.\n");
    switch(WINDOW_GUI_SUPPORT) {
        case 0 :
            PREVIEW_CONSUMER_PRINT("WINDOW_GUI_SUPPORT = WINDOW_GUI_NONE\n");
            break;
        case 1 :
            PREVIEW_CONSUMER_PRINT("WINDOW_GUI_SUPPORT = WINDOW_GUI_GTK\n");
            break;
        case 2 :
            PREVIEW_CONSUMER_PRINT("WINDOW_GUI_SUPPORT = WINDOW_GUI_GLX\n");
            break;
        case 3 :
            PREVIEW_CONSUMER_PRINT("WINDOW_GUI_SUPPORT = WINDOW_GUI_ANDROID\n");
            break;
        default :
            PREVIEW_CONSUMER_PRINT("WINDOW_GUI_SUPPORT undefined\n");
    }
    PROPAGATE_ERROR(m_context.initialize(&window));
    PROPAGATE_ERROR(m_context.makeCurrent());

    // Get window size.
    GLint viewport[4];
    glGetIntegerv(GL_VIEWPORT, viewport);
    m_windowSize = Argus::Size2D<uint32_t>(viewport[2], viewport[3]);

    // Create the shader program to render a texture.
    static const GLfloat quadCoords[] = {1.0f, 0.0f, 1.0f, 1.0f, 0.0f, 0.0f, 0.0f, 1.0f};
    static const char vtxSrc[] =
        "#version 300 es\n"
        "in layout(location = 0) vec2 coord;\n"
        "out vec2 texCoord;\n"
        "void main() {\n"
        "  gl_Position = vec4((coord * 2.0) - 1.0, 0.0, 1.0);\n"
        // Note: Argus frames use a top-left origin and need to be inverted for GL texture use.
        "  texCoord = vec2(coord.x, 1.0 - coord.y);\n"
        "}\n";
    static const char frgSrc[] =
        "#version 300 es\n"
        "#extension GL_OES_EGL_image_external : require\n"
        "precision highp float;\n"
        "uniform samplerExternalOES texSampler;\n"
        "in vec2 texCoord;\n"
        "out vec4 fragColor;\n"
        "void main() {\n"
        "  fragColor = texture2D(texSampler, texCoord);\n"
        "}\n";
    PROPAGATE_ERROR(m_context.createProgram(vtxSrc, frgSrc, &m_program));
    glUseProgram(m_program);
    m_textureUniform = glGetUniformLocation(m_program, "texSampler");
    glUniform1i(m_textureUniform, 0);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, quadCoords);

    // Create an external texture and connect it to the stream as the consumer.
    PREVIEW_CONSUMER_PRINT("Connecting to EGLStream.\n");
    glGenTextures(1, &m_texture);
    glBindTexture(GL_TEXTURE_EXTERNAL_OES, m_texture);
    if (!eglStreamConsumerGLTextureExternalKHR(m_display, m_stream))
        ORIGINATE_ERROR("Unable to connect GL as consumer");

    PREVIEW_CONSUMER_PRINT("Connected to stream.\n");

    return true;
}

bool PreviewConsumerThread::threadExecute() {
    EGLint state = EGL_STREAM_STATE_CONNECTING_KHR;

    // Wait until the Argus producer is connected.
    PREVIEW_CONSUMER_PRINT("Waiting until producer connect...\n");
    while (true) {
        if (!eglQueryStreamKHR(m_display, m_stream, EGL_STREAM_STATE_KHR, &state))
            ORIGINATE_ERROR("Failed to query stream state (possible producer failure).");
        if (state == EGL_STREAM_STATE_NEW_FRAME_AVAILABLE_KHR)
            break;
    }
    PREVIEW_CONSUMER_PRINT("Producer connected; continuing.\n");

    uint32_t frame = 0;
    uint64_t sensorTimestampOffset = 0;
    std::ostringstream awbModeStream;

    // There are two ways of displaying the stream:
    // - display new frames when they become available (this is what is used here);
    // - display the stream at 60fps; if the producer delivers frames at a lower rate, the same frame can be displayed more than once.
    //   To use this method, loop on the condition "while (eglStreamConsumerAcquireKHR(m_display, m_stream))"
    bool done = false;
    while (!done && !m_doShutdown) {

        if (m_doShutdown)
            printf("m_doShutdown detected!\n");

        bool newFrameAvailable = false;
        if (!eglQueryStreamKHR(m_display, m_stream, EGL_STREAM_STATE_KHR, &state) || state == EGL_STREAM_STATE_DISCONNECTED_KHR) {
            done = true;
            break;
        }
        else if (state == EGL_STREAM_STATE_NEW_FRAME_AVAILABLE_KHR) {
            newFrameAvailable = true;
            if (!eglStreamConsumerAcquireKHR(m_display, m_stream)) {
                done = true;
                break;
            }
        }

        if (!done) {
            if (newFrameAvailable) {

                frame += 1;

                // Render the image.
                glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

                // Display frame counter
                m_context.setTextSize(16.0f / m_windowSize.height(), (float)m_windowSize.height() / (float)m_windowSize.width());
                float yPosition = 0.98;
                m_context.setTextPosition(0.02f, yPosition);
                std::ostringstream counterStream;
                counterStream << "Frame: " << std::setw(3) << frame;
                m_context.renderText(counterStream.str().c_str());

                // Get the metadata from the current EGLStream frame.
                // Note: This will likely fail for the last frame since the producer has
                //       already disconnected from the EGLStream, so we need to handle
                //       failure gracefully.
                UniqueObj<EGLStream::MetadataContainer> metadataContainer(EGLStream::MetadataContainer::create(m_display, m_stream));
                EGLStream::IArgusCaptureMetadata *iArgusCaptureMetadata = interface_cast<EGLStream::IArgusCaptureMetadata>(metadataContainer);
                if (iArgusCaptureMetadata)
                {
                    CaptureMetadata *metadata = iArgusCaptureMetadata->getMetadata();
                    const ICaptureMetadata* iMetadata = interface_cast<const ICaptureMetadata>(metadata);
                    if (!iMetadata)
                        ORIGINATE_ERROR("Failed to get Argus metadata\n");

                    // Display timestamp
                    uint64_t sensorTimestamp = iMetadata->getSensorTimestamp();
                    if (sensorTimestampOffset == 0)
                        sensorTimestampOffset = sensorTimestamp;
                    uint64_t sensorTimestampMilliseconds = (sensorTimestamp - sensorTimestampOffset) / 1000000;
                    uint64_t sensorTimestampSeconds = sensorTimestampMilliseconds / 1000;
                    uint64_t sensorTimestampMinutes = sensorTimestampSeconds / 60;
                    sensorTimestampMilliseconds = sensorTimestampMilliseconds - (sensorTimestampSeconds * 1000);
                    sensorTimestampSeconds = sensorTimestampSeconds - (sensorTimestampMinutes * 60);
                    yPosition -= 0.02;
                    m_context.setTextPosition(0.02f, yPosition);
                    std::ostringstream timestampStream;
                    timestampStream << "Timestamp: " << sensorTimestampMinutes << ":" << std::setfill('0') << std::setw(2) << sensorTimestampSeconds << ":" << std::setfill('0') << std::setw(3) << sensorTimestampMilliseconds;
                    m_context.renderText(timestampStream.str().c_str());

                    // Spacing
                    yPosition -= 0.02;

                    // Display framerate
                    yPosition -= 0.02;
                    m_context.setTextPosition(0.02f, yPosition);
                    std::ostringstream framerateStream;
                    framerateStream << "Framerate (fps): " << (float)(1000000000 / iMetadata->getFrameDuration());
                    m_context.renderText(framerateStream.str().c_str());

                    // Display exposure time
                    yPosition -= 0.02;
                    m_context.setTextPosition(0.02f, yPosition);
                    std::ostringstream exposureStream;
                    exposureStream << "Exposure time (us): " << (float)iMetadata->getSensorExposureTime() / 1000.0;
                    m_context.renderText(exposureStream.str().c_str());

                    // Get the sensor private metadata.
                    const Ext::ISensorPrivateMetadata *iSensorMetadata = interface_cast<const Ext::ISensorPrivateMetadata>(metadata);
                    if (!iSensorMetadata)
                        ORIGINATE_ERROR("Failed to get ISensorPrivateMetadata Interface");
                    size_t sensorMetadataSize = iSensorMetadata->getMetadataSize();
                    // Use unsigned bytes so the (uint16_t) casts below don't sign-extend values >= 0x80.
                    unsigned char* sensorMetadata = new unsigned char[sensorMetadataSize];
                    if (!sensorMetadata)
                        ORIGINATE_ERROR("Failed to allocate metadata buffer");
                    if (iSensorMetadata->getMetadata(sensorMetadata, sensorMetadataSize) != STATUS_OK)
                        ORIGINATE_ERROR("Failed to get sensor private metadata");

                    // Spacing
                    yPosition -= 0.02;

                    // Display sensor frame counter
                    yPosition -= 0.02;
                    m_context.setTextPosition(0.02f, yPosition);
                    std::ostringstream sensorMetadataFrmCnt;
                    sensorMetadataFrmCnt << "FRM_CNT: " << std::setw(3) << (uint16_t)sensorMetadata[embeddedDataIndex(7)];
                    m_context.renderText(sensorMetadataFrmCnt.str().c_str());

                    // Display sensor coarse integration time and equivalent exposure time
                    // T sh = T line × (COARSE_INTEG_TIME [lines] + FINE_INTEG_TIME [pixels] / LINE_LENGTH_PCK [pixels/line])
                    uint32_t sensorCoarseIntegTime = (uint16_t)sensorMetadata[embeddedDataIndex(39)] * 256 + (uint16_t)sensorMetadata[embeddedDataIndex(41)];
                    uint32_t sensorFineIntegTime = (uint16_t)sensorMetadata[embeddedDataIndex(35)] * 256 + (uint16_t)sensorMetadata[embeddedDataIndex(37)];
                    uint32_t sensorLineLengthPck = (uint16_t)sensorMetadata[embeddedDataIndex(127)] * 256 + (uint16_t)sensorMetadata[embeddedDataIndex(129)];
                    // T line = LINE_LENGTH_PCK [pixels/line] × IVTPXCK_period / 4 (Total number of image pipe lines)
                    float sensorIvtpxckPeriod = 1.0 / 209.925;
                    float sensorTline = sensorLineLengthPck * sensorIvtpxckPeriod / 4;
                    float sensorTsh = sensorTline * (sensorCoarseIntegTime + ((float)sensorFineIntegTime / sensorLineLengthPck)); // float division: the integer ratio would truncate to 0

                    yPosition -= 0.02;
                    m_context.setTextPosition(0.02f, yPosition);
                    std::ostringstream sensorCoarseIntegTimeStream;
                    sensorCoarseIntegTimeStream << "COARSE_INTEG_TIME: " << sensorCoarseIntegTime;
                    m_context.renderText(sensorCoarseIntegTimeStream.str().c_str());
                    yPosition -= 0.02;
                    m_context.setTextPosition(0.02f, yPosition);
                    std::ostringstream sensorExposureTime;
                    sensorExposureTime << "Exposure time (us): " << sensorTsh;
                    m_context.renderText(sensorExposureTime.str().c_str());

                    delete[] sensorMetadata;
                }

                PROPAGATE_ERROR(m_context.swapBuffers());
            }
        }
    }

    PREVIEW_CONSUMER_PRINT("No more frames. Cleaning up.\n");
    PREVIEW_CONSUMER_PRINT("%s\n", awbModeStream.str().c_str());
    PREVIEW_CONSUMER_PRINT("%d frames were acquired by the consumer.\n", frame);

    PROPAGATE_ERROR(requestShutdown());

    return true;
}

bool PreviewConsumerThread::threadShutdown() {
    glDeleteProgram(m_program);
    glDeleteTextures(1, &m_texture);
    m_context.cleanup();

    PREVIEW_CONSUMER_PRINT("Done.\n");

    return true;
}

/*******************************************************************************
 * Argus Producer thread:
 *   Opens the Argus camera driver, creates an OutputStream to be displayed
 ******************************************************************************/
static bool execute(const MyAppOptions& options) {

    const uint32_t SelectedDevice = 0;
    const uint32_t SelectedMode = 0;

    // Initialize the window and EGL display
    Window &window = Window::getInstance();
    window.setWindowRect(options.windowRect());
    PROPAGATE_ERROR(g_display.initialize(window.getEGLNativeDisplay()));

    // Create the CameraProvider to establish libargus driver connection
    UniqueObj<CameraProvider> cameraProvider(CameraProvider::create());
    // Get the ICameraProvider interface from the global CameraProvider
    // From the ICameraProvider we will get a camera device (1) and create a capture session (2)
    ICameraProvider *iCameraProvider = interface_cast<ICameraProvider>(cameraProvider);
    if (!iCameraProvider)
        ORIGINATE_ERROR("Failed to get ICameraProvider interface");
    printf("Argus Vendor: %s\n", iCameraProvider->getVendor().c_str());
    printf("Argus Version: %s\n", iCameraProvider->getVersion().c_str());

    /*
     * 1. Get a camera device and set it to the correct mode
     */

    // Query available CameraDevices from ICameraProvider interface
    // Based on the number of devices connected, the camera device pointers will be returned by the 'getCameraDevices()' API
    std::vector<CameraDevice*> cameraDevices;
    iCameraProvider->getCameraDevices(&cameraDevices);
    if (cameraDevices.size() == 0)
        ORIGINATE_ERROR("No camera found");
    printf("Number of cameras found: %lu\n", cameraDevices.size());

    // Select a camera device
    CameraDevice *cameraDevice = cameraDevices[SelectedDevice];
    if (!cameraDevice)
        ORIGINATE_ERROR("Selected camera device is not available");

    // Make sure sensor private metadata is supported by the device
    const Ext::ISensorPrivateMetadataCaps *iSensorMetadataCaps = interface_cast<const Ext::ISensorPrivateMetadataCaps>(cameraDevice);
    if (!iSensorMetadataCaps)
        ORIGINATE_ERROR("Failed to get ISensorPrivateMetadataCaps Interface");

    // Get the device properties and capabilities
    ICameraProperties *iCameraProperties = interface_cast<ICameraProperties>(cameraDevice);
    if (!iCameraProperties)
        ORIGINATE_ERROR("Failed to get ICameraProperties interface");

    // Get available sensor modes
    std::vector<SensorMode*> sensorModes;
    Argus::Status status = iCameraProperties->getAllSensorModes(&sensorModes);
    if (status != STATUS_OK)
        ORIGINATE_ERROR("Failed to get sensor modes from device");

    // Select a mode, the sensorMode will also be used when creating the request
    SensorMode* sensorMode = sensorModes[SelectedMode];
    ISensorMode *iSensorMode = interface_cast<ISensorMode>(sensorMode);
    if (!iSensorMode)
        ORIGINATE_ERROR("Selected sensor mode not available");

    // Display sensor mode info
    printf("Sensor mode %d:\n", SelectedMode);
    ArgusSamples::ArgusHelpers::printSensorModeInfo(sensorMode, "    ");

    // Get the framerate and exposure ranges
    Range<uint64_t> limitFrameDurationRange = iSensorMode->getFrameDurationRange();
    printf("Sensor Frame Duration Range min %ju, max %ju\n", limitFrameDurationRange.min(), limitFrameDurationRange.max());
    Range<uint64_t> limitExposureTimeRange = iSensorMode->getExposureTimeRange();
    printf("Sensor Exposure Range min %ju, max %ju\n", limitExposureTimeRange.min(), limitExposureTimeRange.max());

    /*
     * 2. Create a capture session from the CameraProvider interface
     *    It will control all operations on the sensor
     */

    // Create the capture session using the camera device.
    // From there we will get an interface to the core CaptureSession methods (iCaptureSession) (2.1)
    // and an interface for an object which generates Events (2.2)
    UniqueObj<CaptureSession> captureSession(iCameraProvider->createCaptureSession(cameraDevice));
    // Interface to the core CaptureSession methods. From there 3 objects will be created:
    // - an OutputStreamSettings object that is used to configure the creation of the OutputStream (2.1.1)
    // - an OutputStream object using the settings configured by the OutputStreamSettings object (2.1.2)
    // - a request object that will be used with this CaptureSession (2.1.3)
    ICaptureSession *iCaptureSession = interface_cast<ICaptureSession>(captureSession);
    if (!iCaptureSession)
        ORIGINATE_ERROR("Failed to create CaptureSession");

    // 2.1.1 Creates the OutputStreamSettings object that will be used to configure the creation of an OutputStream.
    //       Check https://docs.nvidia.com/jetson/archives/l4t-multimedia-archived/l4t-multimedia-282/classArgus_1_1IOutputStreamSettings.html
    //       for the settings available.
    UniqueObj<OutputStreamSettings> outputStreamSettings(iCaptureSession->createOutputStreamSettings(STREAM_TYPE_EGL));
    IEGLOutputStreamSettings *iEGLStreamSettings = interface_cast<IEGLOutputStreamSettings>(outputStreamSettings);
    if (!iEGLStreamSettings)
        ORIGINATE_ERROR("Failed to create IEGLOutputStreamSettings");
    iEGLStreamSettings->setPixelFormat(PIXEL_FMT_YCbCr_420_888);
    iEGLStreamSettings->setResolution(Size2D<uint32_t>(options.windowRect().width(), options.windowRect().height()));
    iEGLStreamSettings->setEGLDisplay(g_display.get());
    // Sets the mode of the OutputStream.
    // Available options are:
    // - MAILBOX (default):
    //   In this mode, only the newest frame is made available to the consumer. When libargus completes a frame it empties the mailbox
    //   and inserts the new frame into the mailbox. The consumer then retrieves the frame from the mailbox and processes it; when finished,
    //   the frame is either placed back into the mailbox (if the mailbox is empty) or discarded (if the mailbox is not empty). This mode
    //   implies 2 things:
    //   - If the consumer consumes frames slower than libargus produces frames, then some frames may be lost (never seen by the consumer).
    //   - If the consumer consumes frames faster than libargus produces frames, then the consumer may see some frames more than once.
    // - FIFO:
    //   When using this mode, every producer frame is made available to the consumer through the use of a fifo queue for the frames. When
    //   using this mode, the fifo queue length must be specified using setFifoLength. When libargus completes a frame it inserts it to the
    //   head of the fifo queue. If the fifo is full (already contains the number of frames equal to the fifo queue length), libargus will
    //   stall until the fifo is no longer full. The consumer consumes frames from the tail of the queue; however, if the consumer releases
    //   a frame while the queue is empty, the frame is set aside and will be returned again the next time the consumer requests a frame if
    //   another new frame has not been inserted into the fifo queue before then. Once a new frame is inserted into the fifo queue, any
    //   previously released frame will be permanently discarded. This mode implies:
    //   - Frames are never discarded until the consumer has processed them.
    //   - If the consumer consumes frames slower than libargus produces them, libargus will stall.
    //   - If the consumer consumes frames faster than libargus produces them, then the consumer may see some frames more than once.
    //   In this mode use setFifoLength() to set the FIFO queue length of the stream.
    iEGLStreamSettings->setMode(EGL_STREAM_MODE_MAILBOX);
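    // For reference, a sketch of the FIFO alternative described above; the queue
    // length of 4 is an illustrative value, not taken from this program:
    // iEGLStreamSettings->setMode(EGL_STREAM_MODE_FIFO);
    // iEGLStreamSettings->setFifoLength(4);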
    // Enables the sensor metadata: enabling this will allow an EGLStream::MetadataContainer to be created from frames acquired on the
    // consumer side of the EGLStream that will expose the EGLStream::IArgusCaptureMetadata interface, which in turn provides access to
    // the CaptureMetadata corresponding to that frame. This will also enable the IArgusCaptureMetadata interface directly on
    // EGLStream::Frames acquired by an EGLStream::FrameConsumer.
    iEGLStreamSettings->setMetadataEnable(true);

    // 2.1.2 Create OutputStream that is consumed by the preview (OpenGL) consumer
    printf("Creating preview output stream\n");
    UniqueObj<OutputStream> previewStream(iCaptureSession->createOutputStream(outputStreamSettings.get()));
    IEGLOutputStream *iPreviewStream = interface_cast<IEGLOutputStream>(previewStream);
    if (!iPreviewStream)
        ORIGINATE_ERROR("Failed to create Preview Stream");

    // Launch the consumer thread to consume frames from the OutputStream's EGLStream
    printf("Launching preview consumer thread\n");
    PreviewConsumerThread previewConsumerThread(iPreviewStream->getEGLDisplay(), iPreviewStream->getEGLStream());
    PROPAGATE_ERROR(previewConsumerThread.initialize());
    PROPAGATE_ERROR(previewConsumerThread.waitRunning());

    // 2.1.3 Create capture request
    //       The CaptureIntent instructs the driver to populate the request with recommended settings for that intent.
    //       For example, a PREVIEW intent may disable post-processing in order to reduce latency and resource usage while a STILL_CAPTURE
    //       intent will enable post-processing in order to optimize still image quality.
    //       Available options: CAPTURE_INTENT_MANUAL, CAPTURE_INTENT_PREVIEW, CAPTURE_INTENT_STILL_CAPTURE, CAPTURE_INTENT_VIDEO_RECORD, CAPTURE_INTENT_VIDEO_SNAPSHOT
    // CAPTURE_INTENT_MANUAL: 16.491ms, 40fps
    // CAPTURE_INTENT_PREVIEW: 16.491ms, 30fps
    // CAPTURE_INTENT_STILL_CAPTURE: 16.491ms, 30fps
    // CAPTURE_INTENT_VIDEO_RECORD: 16.491ms, 30fps
    // CAPTURE_INTENT_VIDEO_SNAPSHOT: 16.491ms, 30fps
    // none:
    const CaptureIntent intent = CAPTURE_INTENT_PREVIEW;
    UniqueObj<Request> request(iCaptureSession->createRequest(intent));
    IRequest *iRequest = interface_cast<IRequest>(request);
    if (!iRequest)
        ORIGINATE_ERROR("Failed to get capture request interface");

    // The internal post-processing pipeline is generated on a per-request basis and depends on the full set of enabled output streams that
    // have post-processing enabled. To prevent pipeline changes, which may cause visual changes in the preview stream, post-processing
    // is disabled for the preview stream.
    IStreamSettings *streamSettings = interface_cast<IStreamSettings>(iRequest->getStreamSettings(previewStream.get()));
    if (!streamSettings)
        ORIGINATE_ERROR("Failed to get IStreamSettings interface");
    streamSettings->setPostProcessingEnable(false);

    // Enable sensor private metadata output in the request
    Ext::ISensorPrivateMetadataRequest *iSensorPrivateMetadataRequest = interface_cast<Ext::ISensorPrivateMetadataRequest>(request);
    if (!iSensorPrivateMetadataRequest)
        ORIGINATE_ERROR("Failed to get ISensorPrivateMetadata interface");
    iSensorPrivateMetadataRequest->setMetadataEnable(true);

    // Get the source settings for the request
    // Check https://docs.nvidia.com/jetson/archives/l4t-multimedia-archived/l4t-multimedia-282/classArgus_1_1ISourceSettings.html
    printf("Get sensor settings\n");
    ISourceSettings *iSourceSettings = interface_cast<ISourceSettings>(iRequest->getSourceSettings());
    if (!iSourceSettings)
        ORIGINATE_ERROR("Failed to get source settings interface");

    // Get the auto control settings for the request
    IAutoControlSettings* iAutoControlSettings = interface_cast<IAutoControlSettings>(iRequest->getAutoControlSettings());
    if (!iAutoControlSettings)
        ORIGINATE_ERROR("Failed to get AutoControlSettings interface");

    // Enable the output stream: captures made with this request will produce output on the previewStream
    status = iRequest->enableOutputStream(previewStream.get());
    if (status != STATUS_OK)
        ORIGINATE_ERROR("Failed to enable stream in capture request");

    // Set frame rate to 30fps:
    uint64_t userFramerate = 30;
    uint64_t userFrameDuration = 1000000000 / userFramerate;
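    // e.g. 1000000000 ns / 30 fps = 33333333 ns per frame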
    printf("Framerate = %ju fps (Frame Duration = %ju)\n", userFramerate, userFrameDuration);
    status = iSourceSettings->setFrameDurationRange(Range<uint64_t>(userFrameDuration));
    if (status != STATUS_OK)
        ORIGINATE_ERROR("Unable to set the new framerate");

    // Check the settings:
    // Frame duration time range, in nanoseconds (determines frame rate)
    Range<uint64_t> sensorFrameDurationRange = iSourceSettings->getFrameDurationRange();
    printf("FrameDurationRange (ns): [%ju:%ju]\n", sensorFrameDurationRange.min(), sensorFrameDurationRange.max());
    // Exposure time range, in nanoseconds
    Range<uint64_t> sensorExposureTimeRange = iSourceSettings->getExposureTimeRange();
    printf("ExposureTimeRange (ns): [%ju:%ju]\n", sensorExposureTimeRange.min(), sensorExposureTimeRange.max());

    // Set the exposure time
    uint64_t userExposureTime = 30000000;
    printf("Exposure = %ju\n", userExposureTime);
    status = iSourceSettings->setExposureTimeRange(Range<uint64_t>(userExposureTime));
    if (status != STATUS_OK)
        ORIGINATE_ERROR("Unable to set the new exposure time");

    // Set the sensor gain
    status = iSourceSettings->setGainRange(Range<float>(2.0f));
    if (status != STATUS_OK)
        ORIGINATE_ERROR("Unable to set the gain");

    // 2.2 Interface for an object which generates Events
    IEventProvider *iEventProvider = interface_cast<IEventProvider>(captureSession);
    if (!iEventProvider)
        ORIGINATE_ERROR("iEventProvider is NULL");

    std::vector<EventType> eventTypes;
    eventTypes.push_back(EVENT_TYPE_CAPTURE_COMPLETE);
    UniqueObj<EventQueue> queue(iEventProvider->createEventQueue(eventTypes));
    IEventQueue *iQueue = interface_cast<IEventQueue>(queue);
    if (!iQueue)
        ORIGINATE_ERROR("event queue interface is NULL");

    // Set up a repeating request: it will queue a request whenever the request queue is empty and the camera is ready to accept new requests
    // There are 4 different types of requests in order to capture frame:
    // - capture: single session single capture
    // - captureBurst: multiple session single capture
    // - repeat: single session continuous capture until stopRepeat() is called
    // - repeatBurst: multiple session continuous capture until stopRepeat() is called
    status = iCaptureSession->repeat(request.get());
    if (status != STATUS_OK)
        ORIGINATE_ERROR("Unable to submit repeat() request");

    for (uint32_t frameCaptureLoop = 0; frameCaptureLoop < options.frameCount(); frameCaptureLoop++) {
        // Keep PREVIEW display window serviced
        window.pollEvents();

        const uint64_t ONE_SECOND = 1000000000;
        iEventProvider->waitForEvents(queue.get(), ONE_SECOND);
        if(iQueue->getSize() == 0)
            ORIGINATE_ERROR("No events in queue");

        const Event* event = iQueue->getEvent(iQueue->getSize() - 1);
        const IEventCaptureComplete *iEventCaptureComplete = interface_cast<const IEventCaptureComplete>(event);
        if(!iEventCaptureComplete)
            ORIGINATE_ERROR("Failed to get EventCaptureComplete Interface");

        const CaptureMetadata *metaData = iEventCaptureComplete->getMetadata();
        const ICaptureMetadata* iMetadata = interface_cast<const ICaptureMetadata>(metaData);
        if (!iMetadata)
            ORIGINATE_ERROR("Failed to get CaptureMetadata Interface");

        uint64_t frameExposureTime = iMetadata->getSensorExposureTime();
        float frameGain = iMetadata->getSensorAnalogGain();

        // Get sensor private metadata
        const Ext::ISensorPrivateMetadata *iSensorMetadata = interface_cast<const Ext::ISensorPrivateMetadata>(metaData);
        if (!iSensorMetadata)
            ORIGINATE_ERROR("Failed to get ISensorPrivateMetadata Interface");

        size_t sensorMetadataSize = iSensorMetadata->getMetadataSize();
        // Use unsigned bytes so the (uint16_t) casts below don't sign-extend values >= 0x80.
        unsigned char* sensorMetadata = new unsigned char[sensorMetadataSize];
        if (!sensorMetadata)
            ORIGINATE_ERROR("Failed to allocate metadata buffer");

        status = iSensorMetadata->getMetadata(sensorMetadata, sensorMetadataSize);
        if (status != STATUS_OK)
            ORIGINATE_ERROR("Failed to get sensor private metadata");

        printf("Exposure Time: %.3f ms, Analog Gain: %f\n", frameExposureTime / 1000000.0, frameGain);
        printf("FRM_CNT: %3d, ", (uint16_t)sensorMetadata[embeddedDataIndex(7)]);
        printf("COARSE_INTEG_TIME: %4d, ", (uint16_t)sensorMetadata[embeddedDataIndex(39)] * 256 +
                                           (uint16_t)sensorMetadata[embeddedDataIndex(41)]);
        printf("ANA_GAIN_GLOBAL: %5d, ", (uint16_t)sensorMetadata[embeddedDataIndex(43)] * 256 +
                                         (uint16_t)sensorMetadata[embeddedDataIndex(45)]);
        printf("FINE_INTEG_TIME: %4d, ", (uint16_t)sensorMetadata[embeddedDataIndex(35)] * 256 +
                                         (uint16_t)sensorMetadata[embeddedDataIndex(37)]);
        printf("FRM_LENGTH_LINES: %4d, ", (uint16_t)sensorMetadata[embeddedDataIndex(123)] * 256 +
                                          (uint16_t)sensorMetadata[embeddedDataIndex(125)]);
        printf("LINE_LENGTH_PCK: %4d ", (uint16_t)sensorMetadata[embeddedDataIndex(127)] * 256 +
                                        (uint16_t)sensorMetadata[embeddedDataIndex(129)]);
        printf("\n");

        // Exposure time calculation
        // T sh = T line × (COARSE_INTEG_TIME [lines] + FINE_INTEG_TIME [pixels] / LINE_LENGTH_PCK [pixels/line])
        uint32_t sensorCoarseIntegTime = (uint16_t)sensorMetadata[embeddedDataIndex(39)] * 256 + (uint16_t)sensorMetadata[embeddedDataIndex(41)];
        uint32_t sensorFineIntegTime = (uint16_t)sensorMetadata[embeddedDataIndex(35)] * 256 + (uint16_t)sensorMetadata[embeddedDataIndex(37)];
        uint32_t sensorLineLengthPck = (uint16_t)sensorMetadata[embeddedDataIndex(127)] * 256 + (uint16_t)sensorMetadata[embeddedDataIndex(129)];
        // T line = LINE_LENGTH_PCK [pixels/line] × IVTPXCK_period / 4 (Total number of image pipe lines)
        float sensorIvtpxckPeriod = 1.0 / 209.925;
        float sensorTline = sensorLineLengthPck * sensorIvtpxckPeriod / 4;
        float sensorTsh = sensorTline * (sensorCoarseIntegTime + ((float)sensorFineIntegTime / sensorLineLengthPck)); // float division: the integer ratio would truncate to 0
        printf("Sensor Exposure Time: %.3f ms\n", sensorTsh / 1000.0);

        // FPS calculation
        // fps = pix_rate / pix_tot
        // pix_rate = IVTPXCK * 4
        // pix_tot = FRM_LENGTH_LINES * LINE_LENGTH_PCK
        uint32_t sensorFrmLengthLines = (uint16_t)sensorMetadata[embeddedDataIndex(123)] * 256 + (uint16_t)sensorMetadata[embeddedDataIndex(125)];
        uint64_t pixRate = 209925000 * 4;
        uint64_t pixTot = sensorFrmLengthLines * sensorLineLengthPck;
        float sensorFps = (float)pixRate / (float)pixTot; // float division so fractional fps is not truncated
        printf("Sensor Frame Rate: %.3f\n", sensorFps);

        delete[] sensorMetadata;
    }

    // Stop the repeating request and wait for idle
    iCaptureSession->stopRepeat();
    iCaptureSession->waitForIdle();

    // Destroy the output stream (stops consumer threads)
    previewStream.reset();

    // Wait for the consumer thread to complete
    PROPAGATE_ERROR(previewConsumerThread.shutdown());

    // Shut down Argus
    cameraProvider.reset();

    // Shut down the window (destroys window's EGLSurface)
    window.shutdown();

    // Cleanup the EGL display
    PROPAGATE_ERROR(g_display.cleanup());

    PRODUCER_PRINT("Done -- exiting.\n");

    return true;
}

} // namespace ArgusSamples

int main(int argc, char** argv)
{
    ArgusSamples::MyAppOptions options(basename(argv[0]));
    if (!options.parse(argc, argv))
        return EXIT_FAILURE;
    if (options.requestedExit())
        return EXIT_SUCCESS;

    if (!ArgusSamples::execute(options))
        return EXIT_FAILURE;

    return EXIT_SUCCESS;
}

I did try argus_camera: I can set the frame rate to 30fps and request an exposure time of 30ms; however, the actual exposure time stays at 16.491ms.

If I try an exposure of 25ms instead of 30ms, then I see the actual exposure time flickering between 20ms and 25ms, both with my program and with argus_camera.

And at 20ms it is fine with argus_camera but stays at 16.491ms with my program.

Regards,
Ben

Could you fix the gain by setting the gain range to a single value, to check whether the flickering between 20ms and 25ms persists?

Yes, setting the gain to a fixed value does not change anything. This is really a change in the exposure time.

If so, it could be that the exposure register setting is not linear in the 20ms to 25ms range.

Hi,

I upgraded my board from JetPack 4.2.2 to JetPack 4.4 and the flickering problem with argus_camera disappeared; I was no longer able to reproduce it.

However, with my own application the problem is still there: it is impossible to set a manual exposure time of 30ms at 30fps. I see in dmesg that the exposure time is set back to 16ms (the max exposure time at 60fps, which is the default value of the sensor driver). And the call is not made by my application (see the code attached to my earlier message), nor by the driver (there is no problem using v4l2 commands directly), so it must come from the argus library, mustn’t it?

I made a small change in my application, replacing:
iSourceSettings->setExposureTimeRange(Range<uint64_t>(userExposureTime));
with:
iSourceSettings->setExposureTimeRange(Range<uint64_t>(userExposureTime - 1000, userExposureTime + 1000));
and now I can see the correct exposure time being applied, after a few initial frames for which the exposure time is still wrong.

In dmesg, this is what I can see:

[11745.284257] 30-001a: set_exposure: integration time: 29998 [us]
[11745.285935] 30-001a: set_exposure: set integration time: 29998 [us], coarse1:5557 [line], frame length: 6176 [line]
[11745.287464] 30-001a: set_frame_rate: val: 30000001, frame_length set: 6176
[11745.299692] 30-001a: set_exposure: integration time: 29998 [us]
[11745.300652] 30-001a: set_exposure: set integration time: 16570 [us], coarse1:3069 [line], frame length: 3092 [line]
**[11745.302694] 30-001a: set_frame_rate: val: 59925007, frame_length set: 3092**
[11745.320731] 30-001a: set_exposure: integration time: 16570 [us]
[11745.322227] 30-001a: set_exposure: set integration time: 16570 [us], coarse1:3069 [line], frame length: 6176 [line]
[11745.323247] 30-001a: set_frame_rate: val: 30000001, frame_length set: 6176
[11745.353378] 30-001a: set_exposure: integration time: 29999 [us]
[11745.354351] 30-001a: set_exposure: set integration time: 29999 [us], coarse1:5558 [line], frame length: 6176 [line]

I am still looking for an explanation as to why the frame rate is set back to 60fps even though I set it to 30fps.
And why can I not set all the parameters before the streaming starts?

I haven’t tried this for a few releases, so I’m not sure, but you may try to (a sketch in Argus terms follows below):

  • set fixed gain
  • set fixed digitalgain
  • set awblock to true and wbmode to off (or manual, not sure)
  • set aelock to true and set exposuretimerange restricted to your target value.

There is an old example here with gstreamer.
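
In Argus terms, a minimal sketch of these suggestions might look like this (assuming iSourceSettings and iAutoControlSettings have been obtained from the request, as in your program; the gain values and userExposureTime are placeholder assumptions):

// Sketch only: pin down everything the auto algorithms could change.
iSourceSettings->setGainRange(Range<float>(2.0f));                  // fixed sensor analog gain
iAutoControlSettings->setIspDigitalGainRange(Range<float>(1.0f));   // fixed ISP digital gain
iAutoControlSettings->setAwbMode(AWB_MODE_OFF);                     // white balance mode off
iAutoControlSettings->setAwbLock(true);                             // lock auto white balance
iAutoControlSettings->setAeLock(true);                              // lock auto exposure
iSourceSettings->setExposureTimeRange(                              // restrict to the target value
    Range<uint64_t>(userExposureTime, userExposureTime));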

I added fixed gain on the sensor and the ISP.
I set the white balance to AWB_MODE_OFF and AwbLock to true.
But AeLock doesn’t help: it locks the value of the first image, which has the wrong exposure time.