[Feature Request] Long exposures, up to the actual sensor limit, for supported CSI cameras (and other still photography related requests)

Per the Nvidia release notes, there is a 400 ms exposure limit inside the Argus drivers. My use case requires a long exposure (ideally up to the sensor limit of the IMX477, which I believe is ~239 seconds). I know of many other Jetson users (and Pi folks) who would love to use Jetson for long-exposure still photography projects (e.g. astrophotography).
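For reference, below is a minimal sketch (untested, assuming a single CSI camera at index 0 and the standard Jetson Multimedia API Argus headers) of how I would expect to request a long exposure through Argus today. As I understand it, the requested range ends up clamped to the ~400 ms driver limit regardless of what the sensor can do; the output stream setup is omitted, so this only shows where the exposure request would be made:

```cpp
// long_exposure_sketch.cpp -- hedged sketch, not a verified program.
// Build roughly like the Argus samples in the Jetson Multimedia API.
#include <Argus/Argus.h>
#include <cstdio>
#include <vector>

int main()
{
    using namespace Argus;

    // Connect to the Argus camera provider and grab the first CSI camera.
    UniqueObj<CameraProvider> provider(CameraProvider::create());
    ICameraProvider *iProvider = interface_cast<ICameraProvider>(provider);
    if (!iProvider) { std::fprintf(stderr, "Argus provider failed\n"); return 1; }

    std::vector<CameraDevice*> devices;
    iProvider->getCameraDevices(&devices);
    if (devices.empty()) { std::fprintf(stderr, "no CSI cameras found\n"); return 1; }

    UniqueObj<CaptureSession> session(iProvider->createCaptureSession(devices[0]));
    ICaptureSession *iSession = interface_cast<ICaptureSession>(session);

    // Still-intent request; no output stream is attached in this sketch.
    UniqueObj<Request> request(iSession->createRequest(CAPTURE_INTENT_STILL_CAPTURE));
    ISourceSettings *iSource = interface_cast<ISourceSettings>(request);

    // Ask for a 120 s exposure (Argus takes nanoseconds). Today this appears
    // to be clamped to the ~400 ms limit mentioned in the release notes.
    const uint64_t wantedNs = 120ULL * 1000ULL * 1000ULL * 1000ULL;
    iSource->setExposureTimeRange(Range<uint64_t>(wantedNs, wantedNs));

    std::printf("Requested exposure range: %llu ns\n",
                (unsigned long long)wantedNs);
    return 0;
}
```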

It appears that Argus (and V4L2) are heavily designed around a streaming-frame model, which makes it harder to build still photography applications. Streaming frames is great for showing a live preview, but when it comes to actually capturing a frame (with minimal shutter lag), you ideally want much more discrete control over frame timing. The folks at the Raspberry Pi Foundation have made a lot of progress with long exposure support this past year with both their raspistill and raspiraw tools, and it would be great to have similar support on the Jetson platform. Perhaps the libcamera project would be an interesting opportunity to collaborate on an API that supports both still and video workflows.

This was initially brought up in this Nvidia forum post but I was asked to post a new feature request specifically for long exposure support.

One more note on this more still-photography-oriented API: there are other optimizations worth considering. Given the relaxed timing of still capture (i.e. no streaming frames to worry about), it should be possible to tune the ISP to produce higher quality images. Raspberry Pi does something very similar; see here for how they expose separate still and video ports. The still port uses a stronger noise-reduction algorithm than the video port, which results in higher quality output. I believe they may also use a different debayer algorithm in their ISP for the still port, but I haven't confirmed that. Ideally the still capture pipeline should focus on the highest possible output quality, while the video pipeline may need reduced-quality settings to meet its pipeline timing constraints.
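For what it's worth, Argus does already expose a per-request denoise mode, so some of this still-vs-video quality split may be expressible there today. Here is a hedged sketch, assuming a `Request` obtained from a capture session as in the earlier snippet; `configureStillRequest` is just a hypothetical helper name:

```cpp
#include <Argus/Argus.h>

// Hypothetical helper: push "quality over latency" settings onto a still request.
// Assumes `request` came from ICaptureSession::createRequest(CAPTURE_INTENT_STILL_CAPTURE).
static bool configureStillRequest(Argus::Request *request)
{
    using namespace Argus;

    // Stronger (slower) noise reduction for stills; a preview/video request
    // would presumably stay on DENOISE_MODE_FAST to hold its frame timing.
    IDenoiseSettings *denoise = interface_cast<IDenoiseSettings>(request);
    if (!denoise)
        return false;
    denoise->setDenoiseMode(DENOISE_MODE_HIGH_QUALITY);
    denoise->setDenoiseStrength(1.0f);

    return true;
}
```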

Please let me know if you need any further information on my use-case or if you have any questions / thoughts on the concept. Thanks!

I would suggest using an external ISP and using v4l2-ctl or the V4L2 API for it. At least you have full source for the VI and sensor driver.

Ok, that could be an option for my use case. Would you be able to point me in the right direction for how to pass the relevant vertical and horizontal blanking periods and pixel clock timing settings to v4l2-ctl? Or does that need to be done directly with the V4L2 API? Also, as has been noted on the forum, it appears that the IMX477 full sensor readout of 4032x3040 at 30 fps hangs when running streaming frame capture with v4l2-ctl due to MIPI bandwidth. Do you think it's possible to get the full sensor readout to capture a single long exposure frame with v4l2-ctl? Thanks again.
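For concreteness, here is roughly what I imagine doing through the plain V4L2 API, assuming the sensor driver exposes standard exposure/blanking controls on the video node. Control names, IDs, and units vary by driver (Jetson sensor drivers may use custom controls instead), so this is only a sketch; `v4l2-ctl -d /dev/video0 --list-ctrls` would show what the IMX477 driver actually provides:

```cpp
// v4l2_ctrl_sketch.cpp -- hedged sketch: query and set sensor controls via V4L2.
// Control IDs and units depend on the sensor driver.
#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>
#include <cstdio>
#include <cstring>

static void queryControl(int fd, unsigned id, const char *name)
{
    v4l2_query_ext_ctrl q;
    std::memset(&q, 0, sizeof(q));
    q.id = id;
    if (ioctl(fd, VIDIOC_QUERY_EXT_CTRL, &q) == 0)
        std::printf("%s: min=%lld max=%lld step=%llu default=%lld\n",
                    name, q.minimum, q.maximum, q.step, q.default_value);
    else
        std::printf("%s: not exposed by this driver\n", name);
}

int main()
{
    int fd = open("/dev/video0", O_RDWR);  // assumed device node
    if (fd < 0) { std::perror("open /dev/video0"); return 1; }

    // See what range the driver actually advertises before setting anything.
    queryControl(fd, V4L2_CID_EXPOSURE,   "exposure");
    queryControl(fd, V4L2_CID_VBLANK,     "vblank");
    queryControl(fd, V4L2_CID_HBLANK,     "hblank");
    queryControl(fd, V4L2_CID_PIXEL_RATE, "pixel_rate");  // usually read-only

    // Attempt to set exposure (units are driver-defined, often lines or us).
    v4l2_control ctrl;
    std::memset(&ctrl, 0, sizeof(ctrl));
    ctrl.id = V4L2_CID_EXPOSURE;
    ctrl.value = 10000;  // placeholder value; clamp to the queried range
    if (ioctl(fd, VIDIOC_S_CTRL, &ctrl) != 0)
        std::perror("VIDIOC_S_CTRL exposure");

    close(fd);
    return 0;
}
```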

Have a check of the document below for the sensor driver implementation.

And you may need to modify some code to increase the timeout for long exposures.

https://docs.nvidia.com/jetson/l4t/index.html#page/Tegra%20Linux%20Driver%20Package%20Development%20Guide/camera_sensor_prog.47.1.html#

Ok, will do. The only issue I see with relying on V4L2 is that it is still based on a streaming frame model, i.e. even if I get it to support 120-second exposures, there is no way to initiate a capture at an arbitrary time. It needs to “catch” the next frame, which in the above case could mean up to 120 seconds of extra wait time. Is that correct?

If you want to support trigger capture, that would be a problem for the current driver implementation; the current driver only supports streaming mode.

Yeah, trigger capture would be required for my use case, and for most photography-related workflows. Long exposure and streaming frames are somewhat incompatible concepts in practice due to the implied latency. Is that a feasible feature request? Thanks again!

You can try to modify the driver to check, even though it is not officially announced as supported.

Ok, I will investigate that, but please do consider adding trigger and long exposure support (and ideally quality-focused ISP settings for still captures) to the official driver. It would be massively helpful for users who require exact-timing photography workflows.
