Per the Nvidia release notes, there’s a 400 ms exposure limit inside the Argus drivers. My use-case requires a long exposure, ideally up to the sensor limit of the IMX477, which I believe is ~239 seconds. I know of many other Jetson users (and Pi folks) who would love to use the Jetson for long exposure still photography projects (i.e., astrophotography).
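To make the numbers concrete, here is a rough sketch of the usual rolling-shutter relationship (exposure = integration lines × line time). The line length and pixel rate below are illustrative placeholders, not the IMX477’s actual register values:

```python
# Rough rolling-shutter exposure model: exposure = integration_lines * line_time.
# The constants below are illustrative assumptions, NOT real IMX477 register values.

LINE_LENGTH_PCK = 12_740        # line length in pixel clocks (hypothetical)
PIXEL_RATE_HZ = 840_000_000     # pixel clock in Hz (hypothetical)


def exposure_seconds(integration_lines: int) -> float:
    """Exposure time produced by a given number of integration lines."""
    line_time = LINE_LENGTH_PCK / PIXEL_RATE_HZ  # seconds per sensor line
    return integration_lines * line_time
```

Under these made-up constants, the 400 ms driver cap corresponds to only a few tens of thousands of lines, while a ~239 s exposure would need orders of magnitude more — which is why lifting the cap is a driver/API question rather than a sensor limitation.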
It appears that Argus (and V4L2) are heavily designed around a streaming frame model, which makes it harder to build still photography applications. Streaming frames is great for showing a live preview, but actually capturing a frame (with minimal shutter lag) ideally needs much more discrete control of frame timing. The folks at the Raspberry Pi Foundation have made a lot of progress on long exposure support this past year with their raspiraw tools, and it would be great to have similar support on the Jetson platform. The libcamera project could also be an interesting chance to collaborate on an API that supports both still and video workflows.
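The coupling described above — exposure time bounded by the frame interval in a streaming model — can be sketched in a few lines. The function name and values here are hypothetical, purely to illustrate the constraint:

```python
# In a streaming frame model, an exposure request is effectively bounded by
# the frame duration (you can't expose longer than the frame interval without
# stalling the stream) and by any driver-imposed cap. Hypothetical sketch:

def clamp_exposure_us(requested_us: float,
                      frame_duration_us: float,
                      driver_cap_us: float = 400_000.0) -> float:
    """Return the exposure a streaming pipeline would actually grant."""
    return min(requested_us, frame_duration_us, driver_cap_us)


# A 239 s astrophotography request against a 30 fps stream: the ~33.3 ms
# frame interval wins long before the 400 ms driver cap even matters.
granted = clamp_exposure_us(239_000_000, 33_333)
```

A still-oriented API would break this coupling: with no stream to keep fed, the frame duration ceases to be a constraint and only the sensor’s own integration limit should apply.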
This was initially brought up in this Nvidia forum post, but I was asked to post a new feature request specifically for long exposure support.
One more note on this more “still photography” oriented API: there are other optimizations that would be worth considering. Given the relaxed timing of still capture (i.e., no streaming frames to worry about), it should be possible to tune the ISP for higher quality images. Raspberry Pi is doing something very similar — see here for how they have separate concepts of still and video ports. The still port uses a stronger noise reduction algorithm than the video port, which results in higher quality output. I believe they may also be using a different debayer algorithm in their ISP for the still port, but I haven’t confirmed that yet. Ideally, the still capture pipeline should focus on the highest possible quality output, while the video pipeline may need reduced quality settings to meet its timing constraints.
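The still-port/video-port split described above could be modeled as two pipeline configurations selected per capture. Everything here is hypothetical — the field names and algorithm labels are stand-ins, not any real ISP’s API:

```python
# Sketch of a "still port vs. video port" configuration split, loosely
# modeled on the Raspberry Pi approach described above. All names and
# values are hypothetical illustrations, not a real ISP interface.

from dataclasses import dataclass


@dataclass(frozen=True)
class PipelineConfig:
    denoise: str      # noise-reduction algorithm to run in the ISP
    debayer: str      # demosaic algorithm
    realtime: bool    # whether the pipeline must keep up with a stream clock


# Still capture: maximize quality, no streaming deadline to meet.
STILL = PipelineConfig(denoise="high_quality", debayer="multi_pass", realtime=False)

# Video: cheaper algorithms so each frame fits in the frame interval.
VIDEO = PipelineConfig(denoise="fast", debayer="single_pass", realtime=True)


def select_pipeline(capturing_still: bool) -> PipelineConfig:
    """Pick the ISP configuration for the current capture mode."""
    return STILL if capturing_still else VIDEO
```

The key design point is simply that the two modes carry different quality/latency trade-offs, so an API exposing them separately lets the still path spend ISP time that a streaming path cannot afford.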
Please let me know if you need any further information on my use-case or if you have any questions / thoughts on the concept. Thanks!