Adjusting exposure range with Leopard Imaging IMX264

Using the out-of-the-box argus_camera application with a Leopard Imaging IMX264 camera, adjusting the exposure time does not brighten or dim the image. Here are 2 examples of changing the exposure time and switching the auto-exposure toggle. Turning the auto-exposure lock on/off does not appear to change the results either. No matter what I change, the brightness of the image stays roughly constant.

https://drive.google.com/open?id=1FBfug5OtCA4tLwW3fx1xjRVuhkxNo9Ku

https://drive.google.com/open?id=1DO4OHtw7dltBxWJPcRx3a9F7sP_YED8H

To follow this up: with the stock camera, we get exactly what we would expect when decreasing the exposure time:

https://drive.google.com/open?id=0B4f0irN0M8daTGZ5YkRCeklGQU5HY19Tckh1MTBkdXJSc2RB

https://drive.google.com/open?id=0B4f0irN0M8dadnFMNTVMOUNzMHNESy1SSld6UlQzaWNTZ0lB

Changing the EV (exposure compensation) should give you more of what you want.

Thanks Shane,

What I am really looking for is for the camera to obey the exposure settings I request. If we ask for 200,000 ns of exposure, I would expect the camera to honor that. Modifying the one-shot example to include:

status = iSourceSettings->setExposureTimeRange(Range<uint64_t>(200000, 200000));
EXIT_IF_NOT_OK(status, "Failed to setExposureTimeRange");

among other minor changes, and then looking at the metadata that comes back with the image along with the event timestamps, we see this output:

Argus Version: 0.96.2 (multi-process)
Found: 1 cameras
Testing capture
Sensor resolution: 2464 x 2058
Exposure is set to: 200000 ns to 200000 ns
Capture request sent: 1
Acquiring image...
Frame received!
Event queue size: 2, expected: 2
Saw Capture Started Event at: 2514671711
Saw capture complete event at: 2514907270, status: 0
 ExposureTime: 199999 ns
 Readout Time: 0 ns
 Analog Gain: 1.000000
 ISP Gain: 1.000000
 Exposure locked: 0
 ISO: 100
 Sensor timestamp: 2514671711000 ns
 Time delta between capture started and completed events: 235559 ns
Saving image

This leads me to believe the exposure was 200 usec as requested, based on the metadata and the delta between the capture-started and capture-completed event times.

Yet when we use a timed LED sequence (100 usec on, then 200 usec), stepping through each of 3 LEDs one at a time and looping, to measure the exposure length, we see all 3 LEDs lit, which suggests the exposure is far longer than 200 usec.

https://drive.google.com/open?id=1j62vI9sRlZQ8fZOHdewyF3EQkX3mSSsH

This was verified with a FLIR PoE camera using the mono version of the same sensor (IMX264), set to an exposure of 200 usec (188 actual) and capturing at the same time:

https://drive.google.com/open?id=1xsW4hpN3gJbKfzTKxVinWwsmpT5DZ7_M

If the exposure time were correct, we should never see 2 LEDs on at the same time. The LED on/off sequence timing was validated on a scope and is correct, and the FLIR mono camera confirms it. The sensor is clearly capable of this exposure. Either libArgus/the driver is not doing what we ask, or I am doing something wrong. Interestingly, the stock camera that comes with the Jetson TX2 dev kit seems to obey the exposure we ask for (though its rolling shutter isn't capable of 200 usec).

Edit: Here is the source, if you are interested:

#include <stdio.h>
#include <Argus/Argus.h>
#include <EGLStream/EGLStream.h>

using namespace Argus;

#define EXIT_IF_NULL(val,msg)   \
        {if (!(val)) {printf("%s\n",(msg)); return 1;}}

#define EXIT_IF_NOT_OK(val,msg) \
        {if ((val)!=STATUS_OK) {printf("%s\n",(msg)); return 1;}}

int testCapture(ICaptureSession* iSession);

int main(int argc, char**argv) {
	std::vector<CameraDevice*> cameraDevices;

	// create a camera provider
	UniqueObj<CameraProvider> cameraProvider(CameraProvider::create());
	ICameraProvider* iCameraProvider = interface_cast<ICameraProvider>(cameraProvider);
	EXIT_IF_NULL(iCameraProvider, "Could not get camera provider");
	printf("Argus Version: %s\n", iCameraProvider->getVersion().c_str());

	// get the status of the provider
	Argus::Status status = iCameraProvider->getCameraDevices(&cameraDevices);
	EXIT_IF_NOT_OK(status, "Failed to get camera devices\r\n");
	EXIT_IF_NULL(cameraDevices.size(), "No camera devices available");

	printf("Found: %d cameras\r\n", static_cast<int>(cameraDevices.size()));
	for( auto iter : cameraDevices )
	{
		// start a capture session for each camera
		UniqueObj<CaptureSession> captureSession(iCameraProvider->createCaptureSession(iter, &status));
		EXIT_IF_NOT_OK(status, "Failed to createCaptureSession()");

		IEventProvider *iEventProvider = interface_cast<IEventProvider>(captureSession);
		EXIT_IF_NULL(iEventProvider, "iEventProvider is NULL");

		// these are the events we are interested in
		std::vector<EventType> eventTypes;
		eventTypes.push_back(EVENT_TYPE_CAPTURE_STARTED);
		eventTypes.push_back(EVENT_TYPE_CAPTURE_COMPLETE);

		// create an event queue
		UniqueObj<EventQueue> queue(iEventProvider->createEventQueue(eventTypes));
		IEventQueue *iQueue = interface_cast<IEventQueue>(queue);
		EXIT_IF_NULL(iQueue, "event queue interface is NULL");

		ICaptureSession* iSession = interface_cast<ICaptureSession>(captureSession);
		EXIT_IF_NULL(iSession, "Cannot get ICaptureSession");

		printf("Testing capture\r\n");

		// create stream settings
		status = STATUS_OK;
		UniqueObj<OutputStreamSettings> streamSettings(iSession->createOutputStreamSettings());
		IOutputStreamSettings* iStreamSettings = interface_cast<IOutputStreamSettings>(streamSettings);
		EXIT_IF_NULL(iStreamSettings, "Cannot get OutputStreamSettings Interface");
		iStreamSettings->setPixelFormat(PIXEL_FMT_YCbCr_420_888);
		iStreamSettings->setResolution(Size2D<uint32_t>(2456, 2054));
		iStreamSettings->setMetadataEnable(true);

		// create a stream from the stream settings
		UniqueObj<OutputStream> stream(iSession->createOutputStream(streamSettings.get()));
		IStream *iStream = interface_cast<IStream>(stream);
		EXIT_IF_NULL(iStream, "Cannot get OutputStream Interface");

		// create a consumer for the stream
		UniqueObj<EGLStream::FrameConsumer> consumer(EGLStream::FrameConsumer::create(stream.get()));
		EGLStream::IFrameConsumer* iFrameConsumer = interface_cast<EGLStream::IFrameConsumer>(consumer);
		EXIT_IF_NULL(iFrameConsumer, "Failed to initialize Consumer");

		// create a capture request
		UniqueObj<Request> request(iSession->createRequest(CAPTURE_INTENT_STILL_CAPTURE));
		IRequest *iRequest = interface_cast<IRequest>(request);
		EXIT_IF_NULL(iRequest, "Failed to get capture request interface");

		IAutoControlSettings* iAutoControlSettings = interface_cast<IAutoControlSettings>(iRequest->getAutoControlSettings());
		EXIT_IF_NULL(iAutoControlSettings, "Failed to get IAutoControlSettings");

		status = iAutoControlSettings->setAeAntibandingMode( AE_ANTIBANDING_MODE_OFF);
		EXIT_IF_NOT_OK(status, "Failed to setAeAntibandingMode()");

		status = iAutoControlSettings->setAeLock(true);
		EXIT_IF_NOT_OK(status, "Failed to setAeLock()");

		status = iAutoControlSettings->setAwbLock(true);
		EXIT_IF_NOT_OK(status, "Failed to setAwbLock()");

		status = iAutoControlSettings->setExposureCompensation(0.0f);
		EXIT_IF_NOT_OK(status, "Failed to setExposureCompensation()");

		status = iAutoControlSettings->setIspDigitalGainRange(Range<float>(1.0f,1.0f));
		EXIT_IF_NOT_OK(status, "Failed to setIspDigitalGainRange()");

		ISourceSettings* iSourceSettings = interface_cast<ISourceSettings>(iRequest->getSourceSettings());
		EXIT_IF_NULL(iSourceSettings, "Failed to get ISourceSettings");

		status = iSourceSettings->setExposureTimeRange(Range<uint64_t>(200000, 200000));
		EXIT_IF_NOT_OK(status, "Failed to setExposureTimeRange");

		status = iSourceSettings->setGainRange(Range<float>(1.0f, 1.0f));
		EXIT_IF_NOT_OK(status, "Failed to setGainRange");

		ISensorMode *iSensorMode = interface_cast<ISensorMode>(iSourceSettings->getSensorMode());
		EXIT_IF_NULL(iSensorMode, "Failed to get sensor mode interface");
		Argus::Size2D<uint32_t> sensorResolution = iSensorMode->getResolution();
		printf("Sensor resolution: %d x %d\r\n", sensorResolution.width(), sensorResolution.height() );

		// what is the range
		Range<uint64_t> range = iSourceSettings->getExposureTimeRange();
		printf("Exposure is set to: %lu ns to %lu ns\r\n", range.min(), range.max());

		// not really sure why we need to enable the output stream on the request....
		status = iRequest->enableOutputStream(stream.get());
		EXIT_IF_NOT_OK(status, "Failed to enable stream in capture request");

		// send a capture request
		uint32_t requestId = iSession->capture(request.get());
		printf("Capture request sent: %d\r\n", requestId);

		// receive the frame
		printf("Acquiring image...\r\n\r\n");
		UniqueObj<EGLStream::Frame> frame(iFrameConsumer->acquireFrame(5000000000, &status)); // 5 seconds in nanoseconds
		printf("Frame received!\r\n");
		EGLStream::IFrame *iFrame = interface_cast<EGLStream::IFrame>(frame);
		EXIT_IF_NULL(iFrame, "Failed to get IFrame interface");

		// wait for the capture event
		const uint64_t ONE_SECOND = 1000000000;
		iEventProvider->waitForEvents(queue.get(), 5*ONE_SECOND);

		uint64_t frameStartEventTime = 0;

		size_t queueSize = iQueue->getSize();
		printf("Event queue size: %d, expected: %d\r\n", (int)queueSize, 2);
		const Event* event = iQueue->getNextEvent();
		while(event) {
			const IEvent* iEvent = interface_cast<const IEvent>(event);
			EXIT_IF_NULL(iEvent, "Failed to get IEvent interface");
			if(iEvent->getEventType() == EVENT_TYPE_CAPTURE_STARTED )
			{
				frameStartEventTime = iEvent->getTime();
				printf("Saw Capture Started Event at: %lu\r\n", frameStartEventTime);

			} else if(iEvent->getEventType() == EVENT_TYPE_CAPTURE_COMPLETE )
			{
				const IEventCaptureComplete* iEventCaptureComplete = interface_cast<const IEventCaptureComplete>(event);
				if( iEventCaptureComplete ) {
					printf("Saw capture complete event at: %lu, status: %d\r\n", iEvent->getTime(), iEventCaptureComplete->getStatus());
					const CaptureMetadata *metaData = iEventCaptureComplete->getMetadata();
					const ICaptureMetadata* iMetadata = interface_cast<const ICaptureMetadata>(metaData);
					EXIT_IF_NULL(iMetadata, "Failed to get ICaptureMetadata interface");

					printf( "\tExposureTime: %lu ns\r\n"
							"\tReadout Time: %lu ns\r\n"
							"\tAnalog Gain: %f\r\n"
							"\tISP Gain: %f\r\n"
							"\tExposure locked: %d\r\n"
							"\tISO: %d\r\n"
							"\tSensor timestamp: %lu ns\r\n"
							"\tTime delta between capture started and completed events: %lu ns\r\n",
							iMetadata->getSensorExposureTime(),
							iMetadata->getFrameReadoutTime(),
							iMetadata->getSensorAnalogGain(),
							iMetadata->getIspDigitalGain(),
							iMetadata->getAeLocked(),
							iMetadata->getSensorSensitivity(),
							iMetadata->getSensorTimestamp(),
							iEvent->getTime() - frameStartEventTime
					);
				} else
				{
					printf("EVENT_TYPE_CAPTURE_COMPLETE event is not type IEventCaptureComplete\r\n");
				}
			} else {
				printf("Saw UNKNOWN event\r\n");
			}
			event = iQueue->getNextEvent();
		}

		EGLStream::Image *image = iFrame->getImage();
		EXIT_IF_NULL(image, "Failed to get Image from iFrame->getImage()");

		EGLStream::IImageJPEG *iImageJPEG = interface_cast<EGLStream::IImageJPEG>(image);
		EXIT_IF_NULL(iImageJPEG, "Failed to get ImageJPEG Interface");

		printf("Saving image\r\n");
		status = iImageJPEG->writeJPEG("image.jpg");
		EXIT_IF_NOT_OK(status, "Failed to write JPEG");

		captureSession.reset();
	}
	return 0;
}

Could you have the sensor output its register (REG) settings, to check whether any of them change when the API is called?