Set Digital Gain of raw image capture

I’m writing an application which will run my custom-written software ISP modules on the GPU. For capturing raw images I’m using the libargus API. I was looking at the cudaBayerDemosaic and userAlteringAutoExposure examples given in the Jetson Multimedia API argus samples. I have the following questions:

  1. Does the libargus API completely skip the hardware ISP on the Jetson board (in the cudaBayerDemosaic example or any other sample code)?
  2. In userAlteringAutoExposure, the digital gain of the hardware ISP is set using ac->setIspDigitalGainRange. Is there a way to set the on-chip digital gain of the sensor using libargus? (I’m using an IMX219 and have attached a screenshot of the on-chip processing pipeline from the IMX219 datasheet.)
  1. Yes, I suppose so.
  2. Use setGainRange() to set the sensor gain.

Thanks!

  1. All the APIs available in libargus, e.g. setHistogramRegion, setAWBregion, setWBgains, setToneMapCurve, etc., configure the hardware ISP on the Jetson board, right? I’m a little confused because I was thinking these APIs implement the complete modules in software and run them on the Jetson GPU 🫠
  2. I was under the impression that there are three gains: analog gain (an on-chip sensor property), digital gain (another on-chip sensor property), and the digital gain inside the Jetson ISP. setIspDigitalGainRange changes the digital gain of the hardware ISP and setGainRange() changes the sensor’s on-chip analog gain. Is there a way to access the sensor’s on-chip digital gain (or other on-chip properties like black-level adjust, defect correction, etc., as given in section 6 of the datasheet)?
  1. Most of them run on the CPU.
  2. setGainRange() is used for setting the sensor gain.
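For reference, here is how the two sensor-side gain stages on the IMX219 map from register codes to linear gain. This is a hedged sketch based on the formulas in the IMX219 datasheet (register addresses and valid ranges should be double-checked against your datasheet revision):

```python
# Hedged sketch: converting IMX219 gain register codes to linear gains,
# per the formulas in the IMX219 datasheet (verify against your copy).

def imx219_analog_gain(code: int) -> float:
    """ANA_GAIN_GLOBAL (0x0157): gain = 256 / (256 - code), code 0..232."""
    if not 0 <= code <= 232:
        raise ValueError("analog gain code out of range")
    return 256.0 / (256.0 - code)

def imx219_digital_gain(code: int) -> float:
    """DIG_GAIN_GLOBAL (0x0158/0x0159): 8.8 fixed point, 0x0100 = 1.0x."""
    if not 0x0100 <= code <= 0x0FFF:
        raise ValueError("digital gain code out of range")
    return code / 256.0

# The total sensor gain is the product of the two on-chip stages; any
# further digital gain set via setIspDigitalGainRange() is applied later,
# inside the Jetson ISP, not on the sensor.
def total_sensor_gain(analog_code: int, digital_code: int) -> float:
    return imx219_analog_gain(analog_code) * imx219_digital_gain(digital_code)

print(imx219_analog_gain(0))           # 1.0
print(imx219_analog_gain(128))         # 2.0
print(imx219_digital_gain(0x0100))     # 1.0
print(total_sensor_gain(128, 0x0200))  # 4.0
```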

@ShaneCCC Thanks for the reply!!

setGainRange is for setting the sensor’s analog gain only; I was asking about the sensor’s digital gain. I found a similar issue on the NVIDIA forum. From what I understand, there’s no API in libargus to control the sensor’s digital gain; we have to explicitly write a driver for that.

  1. As you said about the cudaBayerDemosaic example inside the Jetson Multimedia API, I can suppose that it bypasses the ISP completely. But I’m seeing auto exposure being applied in the output stream, so either this auto exposure is being done by the Jetson ISP or it is implemented in software inside the libargus library (some code is explicitly telling the sensor to increase exposure time and gain in darkness). I know I can stop/fix the auto exposure using the AE lock or setExposureTimeRange(), but can you please tell me a way to completely bypass the ISP? That is, I want pure raw images from the sensor directly via libargus, with no ISP involved.
  2. There are some functions in the libargus library, e.g. face detection. Since these kinds of functions are not part of the Jetson hardware ISP, they must be implemented inside libargus and run on the CPU/GPU rather than on the hardware ISP. But functions like setIspDigitalGainRange just change a gain value, and the actual gain is applied by the Jetson hardware ISP. So libargus, in addition to configuring the ISP, also has some functions that run completely on the CPU/GPU. Can you please confirm this point, or am I making a mistake in my understanding?
  3. The camera app inside the Jetson Multimedia API has functionality like AE, gain range, AC region, AWB, denoise, edge enhancement, etc. These are all ISP parameters. libargus is just changing the parameters, while the actual edge enhancement, denoising, etc. is done by the hardware ISP, right? Or are these functions also implemented inside libargus, just like face detection?

Argus doesn’t separate the sensor’s analog and digital gain.

  1. The AE/AWB is actually implemented in software in Argus, not in ISP HW logic. I suggest using v4l2 to bypass the ISP.
  2. Same as AE/AWB: some features are implemented in software and run on the CPU or GPU.
  3. Some features are software algorithms.
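Following the suggestion to use v4l2 to bypass the ISP, here is a hedged sketch of what that looks like on an IMX219. The device node, resolution, pixel format, and `bypass_mode` control are assumptions for a typical Jetson setup (check `v4l2-ctl --list-formats-ext` and `--list-ctrls` for your board), and the unpacking assumes one little-endian 16-bit word per pixel with 10 valid bits in the low bits:

```python
# Hedged sketch: unpacking an ISP-bypassed raw IMX219 frame captured
# with v4l2-ctl. A typical capture command (all parameters are
# assumptions for a stock Jetson + IMX219 setup) might be:
#
#   v4l2-ctl -d /dev/video0 \
#       --set-fmt-video=width=3280,height=2464,pixelformat=RG10 \
#       --set-ctrl bypass_mode=0 \
#       --stream-mmap --stream-count=1 --stream-to=frame.raw

import numpy as np

def unpack_rg10(buf: bytes, width: int, height: int) -> np.ndarray:
    """Assumes one little-endian 16-bit word per pixel with 10 valid
    bits in the low bits. Some pipelines left-justify the sample
    instead; in that case shift right by 6 rather than masking.
    Returns the Bayer mosaic as a (height, width) uint16 array."""
    frame = np.frombuffer(buf, dtype="<u2", count=width * height)
    return (frame & 0x3FF).reshape(height, width)

# Synthetic stand-in for real hardware: a 4x4 frame of pixel value 512.
raw = np.full(16, 512, dtype="<u2").tobytes()
bayer = unpack_rg10(raw, 4, 4)
print(bayer.shape, int(bayer.max()))  # (4, 4) 512
```

The resulting mosaic can then be fed straight into a custom GPU demosaic stage, with no hardware ISP processing in between.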

Thanks for the discussion; I have been following this.

@ShaneCCC are you saying that the complete camera application (present in the samples) located at jetson-multimedia-api/argus/apps/camera (attaching a link to the examples uploaded by someone on GitHub, but if you set up your board using JetPack these are already available in the Linux distribution) has all camera functions processed on the GPU/CPU using software algorithms and not on the Jetson ISP?

This is surprising!
Because I always assumed that the sample camera application utilizes the Jetson ISP for processing the stream and handing it over to EGL. That was why there was such low latency…

I must say that the libargus documentation gives no clue about what’s happening behind the scenes. NVIDIA should at least include this in the documentation.

You should see this in the documentation.
Some software, like the 3A algorithms, is implemented in the camera core, but most of the others are in the ISP HW pipeline.

@ShaneCCC thanks for the image. I have seen this already. As you can see as well, the block diagram shows the camera core as a central user-space block that connects to the ISP as well as to the V4L2 Media Controller framework.
Now, which parts of the Jetson ISP libargus can control and which parts are implemented as software algorithms is completely unknown.

In short, this information:

Some software, like the 3A algorithms, is implemented in the camera core, but most of the others are in the ISP HW pipeline.

can be found nowhere in the documentation. I would be glad if you could point me to a place where I can find this information.

Sorry to say, we currently don’t have any public document for this.

Thanks
