Libargus - Limitations of RAW16 capture

Hello everyone,

We are currently integrating MIPI cameras into a system using a Jetson AGX Orin with JetPack 5.1.2.
We want to use libargus, which we’ve previously used to capture YUV data.

However, now we want to receive RAW16 data. I read about some limitations and pitfalls in other threads, so I have a few questions:

  1. The sensor supports up to 16 bit. Are there any limitations when using libargus? I read that the ISP cannot handle RAW16 as input. What is the maximum bit depth on the input side?
  2. Can we use this approach to capture from multiple cameras (Argus MultiSession module)?
  3. Will the ISP still handle auto-exposure/auto-gain in RAW mode?
  4. Is it possible to get white balance parameters from the ISP in RAW mode?
  5. Can features like sharpening, denoising, etc. be utilized somehow in RAW mode?

As our driver implementation is not fully finished, I cannot try the sample code (cudaBayerDemosaic) yet, but I will as soon as I can.
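
For reference, this is roughly how we plan to set up the RAW16 output stream once the driver is ready. It is only a sketch loosely following the cudaBayerDemosaic pattern: the 3840x2160 resolution is a placeholder for our sensor mode, and error handling and the CUDA/EGLStream consumer side are omitted.

```cpp
#include <Argus/Argus.h>
#include <vector>
using namespace Argus;

int main()
{
    // Camera provider and first device (error checks trimmed).
    UniqueObj<CameraProvider> provider(CameraProvider::create());
    ICameraProvider* iProvider = interface_cast<ICameraProvider>(provider);
    std::vector<CameraDevice*> devices;
    iProvider->getCameraDevices(&devices);

    UniqueObj<CaptureSession> session(iProvider->createCaptureSession(devices[0]));
    ICaptureSession* iSession = interface_cast<ICaptureSession>(session);

    // EGL output stream configured for Bayer RAW16 output.
    UniqueObj<OutputStreamSettings> streamSettings(
        iSession->createOutputStreamSettings(STREAM_TYPE_EGL));
    IEGLOutputStreamSettings* iStreamSettings =
        interface_cast<IEGLOutputStreamSettings>(streamSettings);
    iStreamSettings->setPixelFormat(PIXEL_FMT_RAW16);
    iStreamSettings->setResolution(Size2D<uint32_t>(3840, 2160)); // placeholder

    UniqueObj<OutputStream> stream(iSession->createOutputStream(streamSettings.get()));

    // Repeating request feeding the stream; a consumer (as in cudaBayerDemosaic)
    // would attach on the other end of the EGLStream.
    UniqueObj<Request> request(iSession->createRequest());
    IRequest* iRequest = interface_cast<IRequest>(request);
    iRequest->enableOutputStream(stream.get());
    iSession->repeat(request.get());

    // ... consume frames, then stopRepeat() and tear down ...
    return 0;
}
```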

Best regards and thanks for your help
Markus

If you are trying to verify the driver, I think you should use v4l2-ctl first.
Using v4l2-ctl to capture a RAW16 image requires a kernel modification. Ref: Here

v4l2-ctl captures images from /dev/video0. If your driver is written correctly, you should be able to capture RAW16 images.
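
If it helps, the same format negotiation can also be checked programmatically before worrying about streaming. This is only a rough sketch; the SRGGB16 fourcc and the 4K resolution are assumptions, use whatever matches your sensor's Bayer order and mode.

```cpp
// Ask the driver for 16-bit Bayer on /dev/video0 and print what it actually
// accepts (roughly what v4l2-ctl does when it negotiates the format).
#include <fcntl.h>
#include <unistd.h>
#include <cstdio>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main()
{
    int fd = open("/dev/video0", O_RDWR);
    if (fd < 0) { perror("open /dev/video0"); return 1; }

    v4l2_format fmt = {};
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width = 3840;                       // placeholder resolution
    fmt.fmt.pix.height = 2160;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_SRGGB16; // 16-bit Bayer, RGGB order
    fmt.fmt.pix.field = V4L2_FIELD_NONE;

    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) {
        perror("VIDIOC_S_FMT");
        close(fd);
        return 1;
    }

    // The driver reports back what it really configured.
    printf("negotiated %ux%u, fourcc %.4s, bytesperline %u\n",
           fmt.fmt.pix.width, fmt.fmt.pix.height,
           (const char*)&fmt.fmt.pix.pixelformat, fmt.fmt.pix.bytesperline);

    close(fd);
    return 0;
}
```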

But when it comes to the ISP, I am not sure whether we (users) should tune it or not. I'll leave that to NVIDIA to answer, and the same goes for libargus, since it's not open source.

Yes, the ISP pipeline (Argus) doesn't support RAW16.
The current default supports RAW12, and with a patch it can support RAW14.

Thank you for your answers.

@jameskuo
Yes, v4l2-ctl will be the tool of choice to verify the driver integration.
However, the question was more focused on the use of the cameras in our final application. This will be computationally intensive, so we want to offload as much as possible to external components.

@ShaneCCC
Thank you for the clarification. Is the patch to RAW14 available for download?

Can you please also explain which features of the ISP can be used if we capture RAW data with libargus? We will use four 4K cameras in parallel and want to offload as much as possible to the ISP.
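
For context, this is roughly how we intend to drive the four cameras: one CaptureSession per CameraDevice, with the per-camera RAW16 stream and request setup (as sketched in my first post) repeated for each session. This is only my assumption of the multi-session pattern, so please correct me if a different setup is recommended.

```cpp
#include <Argus/Argus.h>
#include <vector>
using namespace Argus;

int main()
{
    UniqueObj<CameraProvider> provider(CameraProvider::create());
    ICameraProvider* iProvider = interface_cast<ICameraProvider>(provider);
    if (!iProvider)
        return 1;

    std::vector<CameraDevice*> devices;
    iProvider->getCameraDevices(&devices); // expecting our four 4K sensors here

    // One independent CaptureSession per sensor; each would get its own RAW16
    // output stream and repeating request.
    std::vector<CaptureSession*> sessions;
    for (CameraDevice* device : devices)
        sessions.push_back(iProvider->createCaptureSession(device));

    // ... per-session stream/request setup and frame consumption ...

    for (CaptureSession* s : sessions)
        s->destroy();
    return 0;
}
```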

Best regards
Markus

Check below for it. It’s only for r35.3.1

Hello @ShaneCCC,

Thank you for providing the patch.

Can you please also answer this part of the question, so I can resolve this thread?

Thank you!

Best regards
Markus

The ISP pipeline would do the demosaicing and AE/AWB …
You can check the MMAPI documentation for the details.

https://docs.nvidia.com/jetson/l4t-multimedia/group__LibargusAPI.html

Hello @ShaneCCC,

We explicitly want to use the PIXEL_FMT_RAW16 format, as in the cudaBayerDemosaic example.
I just need to know whether the following features are supported in RAW mode:

Feature             Supported
Auto Exposure       ?
Auto Gain           ?
Auto Whitebalance   ?
Antibanding         ?
Edge Enhancement    ?
Denoising           ?
CaptureMetadata     ?
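
To make the question concrete, this is roughly how I am requesting those features on the capture request while experimenting. It is only a sketch around the sample's request object; the interface names are taken from the public Argus headers, and whether any of this has an effect in RAW16 mode is exactly what I am unsure about.

```cpp
#include <Argus/Argus.h>
using namespace Argus;

// Applied to the repeating request built in cudaBayerDemosaic; 'request' is
// the Request object from the sample (error checks trimmed).
static void applyTestControls(Request* request)
{
    IRequest* iRequest = interface_cast<IRequest>(request);

    IAutoControlSettings* iAc =
        interface_cast<IAutoControlSettings>(iRequest->getAutoControlSettings());
    iAc->setAeLock(false);                                // auto exposure / auto gain
    iAc->setAwbMode(AWB_MODE_AUTO);                       // auto white balance
    iAc->setAeAntibandingMode(AE_ANTIBANDING_MODE_50HZ);  // antibanding, 50 Hz mains

    IEdgeEnhanceSettings* iEe = interface_cast<IEdgeEnhanceSettings>(request);
    iEe->setEdgeEnhanceMode(EDGE_ENHANCE_MODE_HIGH_QUALITY);
    iEe->setEdgeEnhanceStrength(1.0f);

    IDenoiseSettings* iDnr = interface_cast<IDenoiseSettings>(request);
    iDnr->setDenoiseMode(DENOISE_MODE_HIGH_QUALITY);
    iDnr->setDenoiseStrength(1.0f);
}
```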

With those controls applied, I played around with the cudaBayerDemosaic example and observed the following:

  • Auto exposure / auto gain seemed to work.
  • Antibanding did not have the desired effect; there was still flickering indoors.
  • Edge enhancement didn't seem to have any effect.
  • I did not manage to get CaptureMetadata from the frame (what I tried is sketched below).
    According to the documentation there should be a getBayerHistogram() method, which I'd expect to be applicable in this case, and parameters like exposure and gain should also be included in the metadata.
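
Specifically, this is what I tried for the metadata. It is a sketch following the event-based approach from the MMAPI samples, with error checks trimmed; I may well be missing a step, e.g. enabling metadata on the output stream settings.

```cpp
#include <Argus/Argus.h>
#include <cstdio>
#include <vector>
using namespace Argus;

// Listen for CAPTURE_COMPLETE events on the session and read ICaptureMetadata
// from them. 'session' is the CaptureSession used for the repeating request.
static void dumpMetadata(CaptureSession* session)
{
    IEventProvider* iEventProvider = interface_cast<IEventProvider>(session);

    std::vector<EventType> eventTypes;
    eventTypes.push_back(EVENT_TYPE_CAPTURE_COMPLETE);
    UniqueObj<EventQueue> queue(iEventProvider->createEventQueue(eventTypes));
    IEventQueue* iQueue = interface_cast<IEventQueue>(queue);

    iEventProvider->waitForEvents(queue.get(), 1000000000 /* 1 s timeout, ns */);

    for (uint32_t i = 0; i < iQueue->getSize(); i++)
    {
        const Event* event = iQueue->getEvent(i);
        const IEventCaptureComplete* iComplete =
            interface_cast<const IEventCaptureComplete>(event);
        if (!iComplete)
            continue;

        const ICaptureMetadata* iMetadata =
            interface_cast<const ICaptureMetadata>(iComplete->getMetadata());
        if (!iMetadata)
            continue;

        BayerTuple<float> awb = iMetadata->getAwbGains();
        printf("exposure %llu ns, analog gain %.3f, awb gains R %.3f B %.3f\n",
               (unsigned long long)iMetadata->getSensorExposureTime(),
               iMetadata->getSensorAnalogGain(), awb.r(), awb.b());

        // Bayer histogram, if it is populated at all in RAW16 mode.
        const IBayerHistogram* iHistogram =
            interface_cast<const IBayerHistogram>(iMetadata->getBayerHistogram());
        if (iHistogram)
        {
            std::vector<BayerTuple<uint32_t>> histogram;
            iHistogram->getHistogram(&histogram);
            printf("histogram bins: %zu\n", histogram.size());
        }
    }
}
```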

Thank you for your answer.

Best regards
Markus

For the Argus PIXEL_FMT_RAW16 path, only the AE (exposure/gain) controls are supposed to work.
