Using Argus with DOL HDR acquisition

Is there any documentation on how the DOL HDR mode works? I know the basic concept: the camera takes two exposures in rapid succession. How does Argus handle this? Are there two streams available? Is the data from the two streams combined? I'm using Leopard Imaging's IMX274, which has the two-exposure mode working, and I see a frame in argus_camera (it looks a little washed out). But I'd like some details on how I can control the WDR settings (what the relative exposures are, etc.) and how to access the data (whether it's a high-bit-depth single buffer or multiple streams). I tried reading through the documentation but didn't see much on DOL HDR.

The driver seems fine; I'm more interested in how the Argus API handles the HDR data, i.e. whether fusion is done or the streams stay separate. The camera partner told me to ask on the forum about how this works, since the hardware seems to be working correctly.

The HDR fusion code is not public. For image quality, you will need Leopard's help with the tuning process.
We are also currently working with the camera partner to help improve it.

Ok I will check with them.

Is it possible to acquire the two streams and then do the processing elsewhere? I.e., similar to how you can queue up multi-exposure streams in the argus_camera app, except using DOL HDR so that synchronization between the two streams is tighter.