Request for Argus API to standby image sensor without destroying libargus objects

Hello,
I have an Argus application which captures a burst sequence (using the ICaptureSession::captureBurst() API) of 5 frames from the image sensor every 3 seconds.

Using an inline power measurement tool I can see that the sensor is on and continuously sending frames to the Jetson Nano between burst captures, even though my application is sleeping in a thread. I've looked for an Argus API to put the sensor into standby but don't see one, and I've tried calling IRequest::disableOutputStream and IRequest::enableOutputStream between captures but saw no difference in power draw.
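For reference, the attempt looked roughly like this (a sketch of the idea, not my exact code; `request` and `outputStream` stand for the objects created for the burst, as in the sample below):

```cpp
// Sketch of the attempted workaround: toggle the stream off after a burst
// and back on before the next one. Power draw was unchanged either way.
Argus::IRequest *iRequest = Argus::interface_cast<Argus::IRequest>(request);
iRequest->disableOutputStream(outputStream);  // after captureBurst() returns
// ... thread sleep between bursts ...
iRequest->enableOutputStream(outputStream);   // before the next captureBurst()
```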

So far it seems the only way to send the sensor into standby is to destroy the Argus CaptureSession object and all of its Request/OutputStream objects. However, based on power measurements, it takes several joules of CPU activity to destroy and re-instantiate a CaptureSession and its associated objects.
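The teardown/re-create workaround I'm measuring is essentially the following (a sketch using the same UniqueObj-based structure as the sample below, not a complete program):

```cpp
// Sketch: the only way I've found to reach sensor standby is to tear the
// session down and rebuild it for the next burst. This teardown/rebuild is
// what costs the extra CPU energy on the first/last captures in the graph.
for (Argus::OutputStream* stream : outputStreams)
    stream->destroy();
for (const Argus::Request* request : requests)
    const_cast<Argus::Request*>(request)->destroy();
captureSession.reset();  // sensor enters standby once the session is gone
// ... long sleep; sensor and CSI link are now idle ...
captureSession.reset(iCameraProvider->createCaptureSession(cameraDevice));
// ... recreate streams/requests, then capture the next burst ...
```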

Can you share any method for sending the sensor into standby mode without destroying the Argus CaptureSession? Or can you say whether there are plans to provide new Argus APIs to quickly enter/exit the image sensor's standby mode?

This graph illustrates the total system (Jetson Nano + image sensor) power draw above idle in blue (watts), and cumulative energy consumed in red (joules), for a sequence of 4 burst captures. This was measured on the Jetson Nano with the Raspberry Pi v2 camera sensor, using a burst capture sample application like the one pasted below.

As can be seen from the graph, the middle two burst captures take ~2 joules each, while the first and last captures take ~2.5-3 joules due to the Argus object initialization/destruction they include. In between captures there is a lower power draw, but with a 5 Hz 'tick' that corresponds to the 200 ms frame duration requested for the last capture in the burst. If I use a very long thread sleep between burst captures and disconnect the image sensor's power independently of the system, I see a crash/timeout in libargus, which seems to indicate that libargus is still expecting/receiving incoming MIPI data between my burst captures.

Sample burst capture application

#include "ArgusHelpers.h"
#include "CommonOptions.h"
#include "Error.h"
#include "Thread.h"

#include <Argus/Argus.h>
#include <EGLStream/EGLStream.h>

#include <unistd.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/stat.h>
#include <sys/types.h>
#include <fcntl.h>

#include <iostream>


using namespace Argus;
using namespace EGLStream;

namespace ArgusSamples
{

// Debug print macros.
#define PRODUCER_PRINT(...) printf("PRODUCER: " __VA_ARGS__)
#define CONSUMER_PRINT(...) printf("CONSUMER: " __VA_ARGS__)

/*******************************************************************************
 * FrameConsumer thread:
 *   Creates a FrameConsumer object for each OutputStream, then acquires
 *   frames and prints frame info from the IImage and IImage2D interfaces
 *   while frames are presented.
 ******************************************************************************/
class ConsumerThread : public Thread
{
public:
    explicit ConsumerThread(const std::vector<OutputStream*>& streams, bool& exit) :
        m_streams(streams), m_exit(exit)
    {
    }
    ~ConsumerThread()
    {
    }

private:
    /** @name Thread methods */
    /**@{*/
    virtual bool threadInitialize();
    virtual bool threadExecute();
    virtual bool threadShutdown();
    /**@}*/

    std::vector<OutputStream*> m_streams;
    std::vector<FrameConsumer*> m_consumers;
    bool& m_exit;
};

bool ConsumerThread::threadInitialize()
{
    // Create the FrameConsumer.
    for (OutputStream* stream : m_streams)
    {
        CONSUMER_PRINT("Creating stream\n");
        FrameConsumer* consumer = FrameConsumer::create(stream);
        if (!consumer)
            ORIGINATE_ERROR("Failed to create FrameConsumer.");
        m_consumers.push_back(consumer);
    }

    return true;
}

bool ConsumerThread::threadExecute()
{
    for (OutputStream* stream : m_streams)
    {
        IEGLOutputStream *iStream = interface_cast<IEGLOutputStream>(stream);
        if (!iStream)
            ORIGINATE_ERROR("Failed to get IEGLOutputStream interface.");

        // Wait until the producer has connected to the stream.
        if (iStream->waitUntilConnected() != STATUS_OK)
            ORIGINATE_ERROR("Stream failed to connect.");
    }

    uint32_t frameNum = 0;
    while (true)
    {
        if (m_exit)
            break;

        for (FrameConsumer* consumer : m_consumers)
        {
            IFrameConsumer *iFrameConsumer = interface_cast<IFrameConsumer>(consumer);
            if (!iFrameConsumer)
                ORIGINATE_ERROR("Failed to get IFrameConsumer interface.");
            UniqueObj<Frame> frame(iFrameConsumer->acquireFrame());

            // Use the IFrame interface to print out the frame number/timestamp, and
            // to provide access to the Image in the Frame.
            IFrame *iFrame = interface_cast<IFrame>(frame);
            if (!iFrame)
                ORIGINATE_ERROR("Failed to get IFrame interface.");
            CONSUMER_PRINT("Acquired Frame: %llu, time %llu\n",
                        static_cast<unsigned long long>(iFrame->getNumber()),
                        static_cast<unsigned long long>(iFrame->getTime()));

            // Print out some capture metadata from the frame.
            IArgusCaptureMetadata *iArgusCaptureMetadata = interface_cast<IArgusCaptureMetadata>(frame);
            if (!iArgusCaptureMetadata)
                ORIGINATE_ERROR("Failed to get IArgusCaptureMetadata interface.");
            CaptureMetadata *metadata = iArgusCaptureMetadata->getMetadata();
            ICaptureMetadata *iMetadata = interface_cast<ICaptureMetadata>(metadata);
            if (!iMetadata)
                ORIGINATE_ERROR("Failed to get ICaptureMetadata interface.");
            CONSUMER_PRINT("\tSensor Timestamp: %llu, LUX: %f\n",
                        static_cast<unsigned long long>(iMetadata->getSensorTimestamp()),
                        iMetadata->getSceneLux());

            // Print out image details, and map the buffer to read out some data.
            Image *image = iFrame->getImage();
            IImage *iImage = interface_cast<IImage>(image);
            IImage2D *iImage2D = interface_cast<IImage2D>(image);
            if (!iImage || !iImage2D)
                ORIGINATE_ERROR("Failed to get IImage/IImage2D interfaces.");

            const uint8_t *data_ptr = static_cast<const uint8_t*>(iImage->mapBuffer());
            if (!data_ptr)
                ORIGINATE_ERROR("Failed to map buffer.");

            Size2D<uint32_t> size = iImage2D->getSize(0);
            // Reinterpret the already-mapped buffer as 16-bit samples (PIXEL_FMT_RAW16).
            const uint16_t *d = reinterpret_cast<const uint16_t*>(data_ptr);
            CONSUMER_PRINT("\tIImage(2D): "
                            "buffer (%ux%u, %u stride), "
                            "%02d %02d %02d %02d %02d %02d %02d %02d %02d %02d %02d %02d\n",
                            size.width(), size.height(), iImage2D->getStride(0),
                            d[10000], d[10001], d[10002], d[10003], d[10004], d[10005],
                            d[10006], d[10007], d[10008], d[10009], d[10010], d[10011]);

        }
    }

    CONSUMER_PRINT("Done.\n");

    PROPAGATE_ERROR(requestShutdown());

    return true;
}

bool ConsumerThread::threadShutdown()
{
    for (FrameConsumer* frameConsumer : m_consumers)
    {
        frameConsumer->destroy();
    }
    return true;
}

/*******************************************************************************
 * Argus Producer thread:
 *   Opens the Argus camera driver, creates OutputStreams and Requests for a
 *   burst sequence, then performs several burst captures with a sleep in
 *   between before shutting down the consumer and destroying the objects.
 ******************************************************************************/
static bool execute(const CommonOptions& options)
{
    // Create the CameraProvider object and get the core interface.
    UniqueObj<CameraProvider> cameraProvider = UniqueObj<CameraProvider>(CameraProvider::create());
    ICameraProvider *iCameraProvider = interface_cast<ICameraProvider>(cameraProvider);
    if (!iCameraProvider)
        ORIGINATE_ERROR("Failed to create CameraProvider");
    printf("Argus Version: %s\n", iCameraProvider->getVersion().c_str());

    // Get the selected camera device and sensor mode.
    CameraDevice* cameraDevice = ArgusHelpers::getCameraDevice(
            cameraProvider.get(), options.cameraDeviceIndex());
    if (!cameraDevice)
        ORIGINATE_ERROR("Selected camera device is not available");
    SensorMode* sensorMode = ArgusHelpers::getSensorMode(cameraDevice, options.sensorModeIndex());
    ISensorMode *iSensorMode = interface_cast<ISensorMode>(sensorMode);
    if (!iSensorMode)
        ORIGINATE_ERROR("Selected sensor mode not available");

    // Create the capture session using the selected device and get the core interface.
    UniqueObj<CaptureSession> captureSession(iCameraProvider->createCaptureSession(cameraDevice));
    ICaptureSession *iCaptureSession = interface_cast<ICaptureSession>(captureSession);
    if (!iCaptureSession)
        ORIGINATE_ERROR("Failed to get ICaptureSession interface");

    // Create the OutputStream.
    PRODUCER_PRINT("Creating output stream\n");
    UniqueObj<OutputStreamSettings> streamSettings(
        iCaptureSession->createOutputStreamSettings(STREAM_TYPE_EGL));
    IEGLOutputStreamSettings *iStreamSettings =
        interface_cast<IEGLOutputStreamSettings>(streamSettings);
    if (iStreamSettings)
    {
        iStreamSettings->setPixelFormat(PIXEL_FMT_RAW16);
        iStreamSettings->setResolution(iSensorMode->getResolution());
        iStreamSettings->setMode(EGL_STREAM_MODE_FIFO);
        iStreamSettings->setMetadataEnable(true);
    }

    std::vector<OutputStream*> outputStreams;
    std::vector<const Request*> requests;
    std::vector<std::pair<OutputStream*, Request*>> outputRequestPairs;

    std::vector<std::pair<uint64_t,uint64_t>> nsec_exp_dur_pairs =
    {
        {781250,     50000000},
        {3125000,    50000000},
        {12500000,   50000000},
        {50000000,   50000000},
        {200000000,  200000000},
    };

    for (std::pair<uint64_t,uint64_t> nsec_exp_dur: nsec_exp_dur_pairs)
    {
        uint64_t nsec_exp = nsec_exp_dur.first;
        uint64_t nsec_dur = nsec_exp_dur.second;

        PRODUCER_PRINT("Create output stream and request for %llu nsec exposure\n",
                       static_cast<unsigned long long>(nsec_exp));

        // create output stream and request, and enable output stream on request
        OutputStream* outputStream = iCaptureSession->createOutputStream(streamSettings.get());
        Request* request = iCaptureSession->createRequest(Argus::CAPTURE_INTENT_STILL_CAPTURE);
        outputRequestPairs.push_back(std::make_pair(outputStream, request));
        IRequest *iRequest = interface_cast<IRequest>(request);
        if (!iRequest)
            ORIGINATE_ERROR("Failed to create Request");
        iRequest->enableOutputStream(outputStream);

        // lock AE
        IAutoControlSettings *iAutoControlSettings =
            interface_cast<IAutoControlSettings>(iRequest->getAutoControlSettings());
        if (!iAutoControlSettings)
            ORIGINATE_ERROR("Failed to get IAutoControlSettings interface");
        if (iAutoControlSettings->setAeLock(true) != Argus::STATUS_OK)
            ORIGINATE_ERROR("Failed to set AE lock");
        if (iAutoControlSettings->setIspDigitalGainRange(Range<float>(1.0)) != Argus::STATUS_OK)
            ORIGINATE_ERROR("Failed to set digital gain range");

        // Configure source settings in the request.
        ISourceSettings *iSourceSettings = interface_cast<ISourceSettings>(request);
        if (!iSourceSettings)
            ORIGINATE_ERROR("Failed to get source settings request interface");
        iSourceSettings->setSensorMode(sensorMode);
        Range<uint64_t> durRange = Range<uint64_t>(nsec_dur);
        iSourceSettings->setFrameDurationRange(durRange);
        Range<uint64_t> expRange = Range<uint64_t>(nsec_exp);
        iSourceSettings->setExposureTimeRange(expRange);
        iSourceSettings->setGainRange(Range<float>(1.0));

        outputStreams.push_back(outputStream);
        requests.push_back(request);
    }

    bool exit = false;
    ConsumerThread frameConsumerThread(outputStreams, exit);
    PROPAGATE_ERROR(frameConsumerThread.initialize());
    // Wait until the consumer is connected to the stream.
    PROPAGATE_ERROR(frameConsumerThread.waitRunning());

    for (int i = 0; i < 3; i++)
    {
        PRODUCER_PRINT("Capture burst of length %zu, max burst length %u\n",
                       requests.size(), iCaptureSession->maxBurstRequests());
        iCaptureSession->captureBurst(requests);

        // Wait three seconds between calls to captureBurst().
        usleep(3000000);
    }

    // Send a final burst to wake the consumer thread so it can exit.
    exit = true;
    PRODUCER_PRINT("Capture final burst of length %zu, max burst length %u\n",
                   requests.size(), iCaptureSession->maxBurstRequests());
    iCaptureSession->captureBurst(requests);

    // Wait for the consumer thread to complete.
    PROPAGATE_ERROR(frameConsumerThread.shutdown());


    for (OutputStream* outputStream : outputStreams)
    {
        // Destroy the output stream to end the consumer thread.
        outputStream->destroy();
    }

    for (const Request* request : requests)
    {
        // Destroy the capture requests.
        const_cast<Request*>(request)->destroy();
    }

    PRODUCER_PRINT("Done -- exiting.\n");

    return true;
}

}; // namespace ArgusSamples

int main(int argc, char** argv)
{
    ArgusSamples::CommonOptions options(basename(argv[0]),
                                        ArgusSamples::CommonOptions::Option_D_CameraDevice |
                                        ArgusSamples::CommonOptions::Option_M_SensorMode |
                                        ArgusSamples::CommonOptions::Option_T_CaptureTime);
    if (!options.parse(argc, argv))
        return EXIT_FAILURE;
    if (options.requestedExit())
        return EXIT_SUCCESS;

    for (int i = 0; i < 1; i++)
    {
        PRODUCER_PRINT("Argus execute %d\n", i);
        if (!ArgusSamples::execute(options))
            return EXIT_FAILURE;
    }

    return EXIT_SUCCESS;
}

Thanks for any help optimizing the power usage of libargus.

The current design can't support your use case.
A feature request may need to be filed into the internal plan.

Thank you for the response. Is there any way I can formalize the feature request to receive updates and/or a projected implementation schedule?

Looks like for now I need to plan for workarounds.