Long exposure time

@ShaneCCC
I’m not sure I understand. gstreamer and v4l2 use the same driver, right? So why does only one of them work with the modified driver code?

I am confused here. Do you mean v4l2-ctl can’t work with fps >= 1, but gstreamer does?
The difference between v4l2 and gstreamer is that the frame rate/gain/exposure settings are applied differently.

@ShaneCCC I tried the v4l2 command you provided with the unmodified driver and it didn’t work either.

So v4l2 doesn’t work at all. Gstreamer works only for FPS >= 1.

You may need to make sure the sensor register configuration is the same for both v4l2 and gstreamer.

@ShaneCCC I’m not sure that getting v4l2 working will help me. I’m interested in increasing the exposure time for gstreamer. Is it worth debugging v4l2?

At least confirm with the vendor whether it works on their demo board, to verify that the MIPI timing is still within spec when the frame rate is modified to less than 1.

@ShaneCCC You suggested that Argus doesn’t support FPS < 1. I’ve tested it, and it seems that it is supported. As I mentioned before, there is an error like this:

fence timeout on [ffffffc08b98a780] after 1500ms

so I set the FPS so as not to exceed that timeout, i.e. 2/3 (one frame should be handled in 1.5 s at this setting), and I was able to get a frame successfully. The problem occurs when the FPS value makes the time to get a frame longer than 1.5 s.
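A small sketch of the arithmetic above (the helper names are mine; the 1.5 s constant is the fence timeout value from the kernel log):

```python
from fractions import Fraction

# "fence timeout on [...] after 1500ms" from the kernel log
FENCE_TIMEOUT_S = 1.5

def frame_period_s(fps: Fraction) -> float:
    """Time to deliver one frame at a (possibly fractional) frame rate."""
    return float(1 / fps)

def fits_fence_timeout(fps: Fraction) -> bool:
    """True if one frame arrives before the 1.5 s fence timeout fires."""
    return frame_period_s(fps) <= FENCE_TIMEOUT_S

print(fits_fence_timeout(Fraction(2, 3)))  # 2/3 fps -> 1.5 s per frame: just fits
print(fits_fence_timeout(Fraction(1, 2)))  # 1/2 fps -> 2.0 s per frame: times out
```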

I found the log about timeout in kernel source file: kernel/kernel-4.9/drivers/staging/android/sync.c
in the function int sync_fence_wait(struct sync_fence *fence, long timeout), but I cannot find where it is called with the 1500 ms timeout value. Is it possible that libargus calls it? As far as I know, the Argus sources are not public.

You can use the commands below to make the timeout infinite.

sudo service nvargus-daemon stop
sudo enableCamInfiniteTimeout=1 nvargus-daemon

@ShaneCCC I’m getting the following output for gst-launch (for now the fps is 1):

gst-launch-1.0 -e \
  nvarguscamerasrc num-buffers=1 exposuretimerange="500000000 500000000" ! \
  "video/x-raw(memory:NVMM),width=4032,height=3040,framerate=1/1" ! \
  nvjpegenc quality=100 ! \
  multifilesink location=./test.jpg
GST_ARGUS: NvArgusCameraSrc: Setting Exposure Time Range : 500000000 500000000
GST_ARGUS: !!! EXP RANGE: 500000000,000000, 500000000,000000
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Error generated. gstnvarguscamerasrc.cpp, execute:656 Failed to create CaptureSession
Caught SIGSEGV
Got EOS from element "pipeline0".
Execution ended after 0:00:02.086712276
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
#0  0x0000007f864f2310 in __GI___pthread_timedjoin_ex (threadid=547585204720, thread_return=0x0, abstime=0x0, block=<optimized out>)
#1  0x0000007f865ae968 in  () at /usr/lib/aarch64-linux-gnu/libglib-2.0.so.0
#2  0x0000007f85b94000 in  ()
Spinning.  Please run 'gdb gst-launch-1.0 13732' to continue debugging, Ctrl-C to quit, or Ctrl-\ to dump core.

And nvargus-daemon log:

=== NVIDIA Libargus Camera Service (0.97.3) ===
Listening for connections...
=== gst-launch-1.0[13540]: Connection established (7F8D0811D0)
OFParserListModules: module list: /proc/device-tree/tegra-camera-platform/modules/module0
NvPclHwGetModuleList: No module data found
OFParserGetVirtualDevice: NVIDIA Camera virtual enumerator not found in proc device-tree
---- imager: No override file found. ----
=== gst-launch-1.0[13540]: CameraProvider initialized (0x7f8892d9c0)
SCF: Error BadParameter:  (propagating from src/services/capture/NvViCsiHw.cpp, function openViCsi(), line 118)
SCF: Error BadParameter:  (propagating from src/services/capture/CaptureServiceDeviceViCsi.cpp, function open(), line 307)
SCF: Error BadParameter:  (propagating from src/services/capture/CaptureServiceDevice.cpp, function openSource(), line 355)
SCF: Error BadParameter:  (propagating from src/services/capture/CaptureService.cpp, function openSource(), line 478)
SCF: Error BadParameter:  (propagating from src/api/Session.cpp, function initialize(), line 263)
SCF: Error BadParameter:  (propagating from src/api/CameraDriver.cpp, function createSession(), line 577)
(Argus) Error BadParameter:  (propagating from src/api/CaptureSessionImpl.cpp, function initialize(), line 120)
(Argus) Error BadParameter:  (propagating from src/api/CameraProviderImpl.cpp, function createCaptureSession(), line 258)
(NvCameraUtils) Error InvalidState: Mutex not initialized (/dvs/git/dirty/git-master_linux/camera/argus/src/api/CaptureSessionImpl.cpp:197) (in Mutex.cpp, function lock(), line 79)
(Argus) Error InvalidState: Element not found (in /dvs/git/dirty/git-master_linux/camera/utils/nvcamerautils/inc/Vector.h, function remove(), line 172)
(NvCameraUtils) Error InvalidState: Mutex has not been initialized (in Mutex.cpp, function unlock(), line 88)

Did only exposuretimerange="500000000 500000000" fail?
I think when the sensor is configured with this value, it changes the output timing for capturing data from the sensor.

Could you try setting this exposuretimerange but modifying the sensor driver to program a normal value, to cheat Argus, and confirm it?

@ShaneCCC I’ve tried to run

sudo enableCamInfiniteTimeout=1 nvargus-daemon

on a system with the unmodified driver, and the result was the same. It seems it isn’t working at all. I’m using JetPack 4.5.1.

Also, dmesg isn’t showing any messages from the driver. I guess something crashes before any communication with the camera sensor.

Oops, sorry, I just realized the Nano doesn’t support enableCamInfiniteTimeout.

@ShaneCCC OK, so are there any other options for my use case?

Depending on what you are attempting, you could stack exposures to emulate a long exposure.
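The stacking idea can be sketched like this (synthetic arrays stand in for real captures; in practice the frames would come from the camera pipeline, which is omitted here):

```python
import numpy as np

def stack_exposures(frames):
    """Sum N short exposures to emulate one exposure N times as long.

    Accumulating in a wider dtype avoids 8-bit overflow; the result is
    kept as 16-bit and can be tone-mapped back down afterwards.
    """
    acc = np.zeros(frames[0].shape, dtype=np.uint32)
    for f in frames:
        acc += f
    return np.clip(acc, 0, 65535).astype(np.uint16)

# Synthetic stand-ins for ten short captures of a dim scene.
frames = [np.full((4, 4), 20, dtype=np.uint8) for _ in range(10)]
stacked = stack_exposures(frames)
print(stacked[0, 0])  # 200: ten 20-count frames summed
```

Note that stacking adds read noise once per frame, so it is not fully equivalent to one true long exposure, which is the limitation raised in the reply below.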

Thanks, but I need a true long exposure to get frames at night. The supported exposure time and gain are not sufficient.

I think I saw previously that you can do 200 seconds with raspistill on the Pi. Have you attempted to build the app from source on the Jetson?


It is not possible to compile RaspiStill on the Jetson Nano. It uses a different API layer for hardware communication.

There is the nano still I posted, but I was reading that the way long exposure is done even on the Pi is that the software stacks shorter exposures rather than doing one long shot.

Re: Raspistill slow for long exposures

Thu Jul 20, 2017 7:31 am

Raspistill actually has to take multiple images to get exposure etc. set. If you use a little Python loop with capture_continuous you can reduce the gap between images to <1 second. Also note you can use up to 10 secs exposure time.

I implemented long exposure in the kernel driver based on the IMX477 driver for the Raspberry Pi. The problem lies somewhere in the NVIDIA software calling the driver API, I guess.

Have there been any updates on this front, particularly for the Nano + IMX477? My use-case requires exposures longer than 5 seconds. Thanks!