Hi!

I am looking for some documentation resources on the “irradiance” output of the IRay renderer that could help me with a few questions I have:

- When you hover over a pixel to show its irradiance value, the unit is displayed as lux (lm/m²). As such, the quantity might actually be “illuminance” rather than “irradiance”, which would be measured in W/m². Is that correct?
- If I grabbed the irradiance output as a texture, what data format would it be? E.g., is it one float value per pixel?
- If I then wanted to calculate the illuminance of the red, green, and blue pixel components separately, i.e., by weighting the pixel illuminance by the relative spectral distribution integrals of red, green, and blue, could I get access to these values with tonemapping disabled?
- Is there a way to put “irradiance probes”, which are mentioned in the IRay docs?

I should note that my actual goal is to get an estimate of the number of photons per second hitting a pixel. Since irradiance relates to the photon count if you know the wavelengths, and illuminance in turn relates to irradiance, I am looking for a way to, e.g., convert the render output for a pixel into something like:

- The red intensity value of the pixel equates to x W/m² over a wavelength range of 615-645 nm.
- The green intensity value of the pixel equates to y W/m² over a wavelength range of 500-580 nm.
- The blue intensity value of the pixel equates to z W/m² over a wavelength range of 440-490 nm.

(wavelength ranges are just examples)
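To illustrate the end goal, here is a minimal sketch of the last conversion step: turning a band irradiance in W/m² into an approximate photon flux, assuming all power sits at the band's center wavelength. All numbers (irradiance, pixel area, band center) are made up for illustration.

```python
# Rough photon-flux estimate from band irradiance.
# Assumption: all power concentrated at the band's center wavelength.
h = 6.62607015e-34  # Planck constant, J*s
c = 2.99792458e8    # speed of light, m/s

irradiance = 0.5     # W/m^2 in the red band (hypothetical value)
wavelength = 630e-9  # m, center of the 615-645 nm example band
pixel_area = 1e-6    # m^2 covered by the pixel (hypothetical value)

photon_energy = h * c / wavelength  # J per photon at the band center
photons_per_second = irradiance * pixel_area / photon_energy
print(photons_per_second)  # on the order of 1e12 photons/s for these inputs
```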

Thank you for your help!

Cheers,

Arne

Hi @arne.jacobs. I’m checking with the iray devs.


Hi Arne. Here’s what the devs have to say:

- The “irradiance” render target indeed by default reports illuminance (photometric units) rather than irradiance (radiometric units). We’re thinking of renaming that to avoid confusion.
- The result is always a colored (RGB) representation of the illuminance. Assuming linear sRGB, the actual (scalar) illuminance (unit: lux) can be computed as Y = r * 0.212656 + g * 0.715158 + b * 0.0721856.
- Generally, the RGB output of all color buffers in Iray is computed from spectral input using a weighted integral over a range of wavelengths (different weights per channel). Of course this only comes into play if spectral rendering is enabled. Per default iray uses the CIE XYZ color matching function (default 1931 version) and a transform to linear sRGB, but it is actually configurable and custom weights can be provided. There is also a second mode that offers an unweighted integral over a range of wavelengths (giving direct access to irradiance in W/m^2).
**None of that is exposed in Omniverse as far as I know.**
- Probes are not exposed unfortunately. Spectral rendering is exposed, at least partially.
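Based on the second point above, collapsing the RGB buffer to scalar illuminance would look roughly like this. This is a sketch, not Omniverse API code: how you actually obtain the buffer is not shown, and `buf` here is just a fabricated single-pixel array standing in for the (H, W, 3) linear-sRGB irradiance render target.

```python
import numpy as np

# Stand-in for the irradiance render target: (H, W, 3) linear-sRGB floats.
buf = np.array([[[100.0, 200.0, 50.0]]])  # one hypothetical pixel

# Luminance weights quoted above (linear sRGB -> CIE Y).
LUMA_WEIGHTS = np.array([0.212656, 0.715158, 0.0721856])

# Per-pixel scalar illuminance in lux, shape (H, W).
illuminance = buf @ LUMA_WEIGHTS
print(illuminance[0, 0])  # 167.90648
```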

Thank you very much for the information! This is already very helpful.

However, I have a follow-up question:

It seems strange to me that the scalar illuminance in lux should be computed using these weights if the separate r, g, b values are already given in lux. These weights probably correspond to the different perceived intensities by the human eye, but shouldn't that already have been accounted for in the r, g, b lux values?

An example: a monochromatic red scene should result in non-zero lux values for the r component, and in close-to-zero values for the other two (g,b). If using the above formula for computing the combined scalar lux from the three r,g,b values, one would get a value much lower than the original lux based on only the r component. In such a monochromatically illuminated scene both should be basically equal or at least very similar, to my understanding.

Could you explain this relation a bit more, please?

Also, it would be nice to know the weights used to compute the r,g,b lux values for spectral rendering in Omniverse.

Thanks a lot!

Cheers,

Arne

Hey @arne.jacobs. Here’s the response from the devs:

R, G, B are not exactly “in lux”; the unit “lux” applies to the scalar quantity illuminance. Since we output a color buffer (and color information may be useful, even for illuminance), we output a color which fulfills cie_luminance(color) = illuminance, where cie_luminance computes the CIE XYZ color space Y channel value (“luminance”) for the color. The computation is essentially a matrix transform from linear sRGB to XYZ, so that's where the weights come from. The relation of radiometric to photometric units is completely defined by the CIE Y color matching function; the X and Z components are not relevant for the units.

> An example: a monochromatic red scene should result in non-zero lux values for the r component, and in close-to-zero values for the other two (g,b). If using the above formula for computing the combined scalar lux from the three r,g,b values, one would get a value much lower than the original lux based on only the r component. In such a monochromatically illuminated scene both should be basically equal or at least very similar, to my understanding.
>
> Could you explain this relation a bit more, please?

The CIE Y value is different for saturated red, green, or blue. It is designed to be aligned with the human perception of the “luminance” of that color. This means that a blue (which is perceived as darkest) needs a higher value to be perceived at a certain luminance than a green (which is perceived as brightest): (0,1,0) in linear sRGB has luminance 0.715158, while (0,0,1) only has luminance 0.0721856. While this may seem a bit extreme, one needs to consider that these are linear quantities and human perception is not linear.
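The point about saturated primaries can be checked directly with the weights quoted earlier in the thread:

```python
# CIE luminance (Y) of a linear-sRGB color, using the weights from above.
def cie_luminance(r, g, b):
    return 0.212656 * r + 0.715158 * g + 0.0721856 * b

print(cie_luminance(1.0, 0.0, 0.0))  # saturated red   -> 0.212656
print(cie_luminance(0.0, 1.0, 0.0))  # saturated green -> 0.715158
print(cie_luminance(0.0, 0.0, 1.0))  # saturated blue  -> 0.0721856
```

So a pure-red scene with r = 100 lux-per-channel would report a scalar illuminance of about 21.3 lux, which is exactly the behavior the question describes.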

> Also, it would be nice to know the weights used to compute the r,g,b lux values for spectral rendering in Omniverse.

Iray internally computes the spectral radiance (for regular output) or spectral irradiance (for irradiance output) in radiometric units. Before the values are stored in the result buffer, we need to convert them from spectral to color space. The first step is the CIE XYZ color space. This is done as an integral over the whole visible spectrum of the spectral values weighted by the corresponding color matching function, i.e.

X = \int_{380nm}^{780nm} cmf_x(lambda) * value(lambda) dlambda

Y = \int_{380nm}^{780nm} cmf_y(lambda) * value(lambda) dlambda

Z = \int_{380nm}^{780nm} cmf_z(lambda) * value(lambda) dlambda

The CIE XYZ color matching functions are standardized and available on the net (e.g. Colour matching functions). They are typically specified such that cmf_y peaks at 1.0, in which case a scale of 683.002 needs to be applied, to correctly establish the relation of radiometric to photometric units (e.g. spectral irradiance in Watt/m^2/nm to illuminance in lux). The Y channel output then contains the photometric quantity in photometric units.
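The Y integral with the 683.002 lm/W scale can be sketched numerically. Note the caveat: the `cmf_y` below is a crude Gaussian stand-in normalized to peak at 1.0, not the real CIE 1931 table (which should be used in practice), and the flat test spectrum is made up.

```python
import numpy as np

wavelengths = np.arange(380.0, 781.0, 1.0)  # nm, 1 nm sampling
d_lambda = 1.0                              # nm sample spacing

# Crude Gaussian stand-in for cmf_y, peaking at 1.0 near 555 nm.
# Replace with the tabulated CIE 1931 y-bar values for real use.
cmf_y = np.exp(-0.5 * ((wavelengths - 555.0) / 45.0) ** 2)

# Hypothetical flat spectral irradiance: 0.01 W/m^2/nm across the band.
spectral_irradiance = np.full_like(wavelengths, 0.01)

# Y = integral of cmf_y(lambda) * value(lambda) d lambda  (Riemann sum)
Y = np.sum(cmf_y * spectral_irradiance) * d_lambda

# Scale by 683.002 lm/W since cmf_y peaks at 1.0 -> illuminance in lux.
illuminance = 683.002 * Y
print(illuminance)
```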

There are two flavors of color matching functions, the 1931 and the 1964 version. This is configurable in Omniverse, the default is 1931 which strictly speaking is also the one defining the relation to photometric units.

The CIE XYZ result is then converted to linear sRGB via a simple matrix transform, and this is the value that is output in the buffer (it appears that this transform is currently not configurable in Omniverse).
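That last step, as a sketch using the standard sRGB (D65) matrix; this is the generic transform from the sRGB specification, not necessarily Iray's exact coefficients:

```python
import numpy as np

# Standard CIE XYZ -> linear sRGB matrix (D65 white point).
XYZ_TO_LINEAR_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

xyz = np.array([0.5, 0.5, 0.5])  # example XYZ triple
rgb = XYZ_TO_LINEAR_SRGB @ xyz   # linear-sRGB values as stored in the buffer
print(rgb)
```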

Thank you very much for the elaborate answer.

So ideally I would want to use the spectral data before weighting with the color matching functions, but as I understood it, this is not exposed in Omniverse currently. Having the illuminance and corresponding color already helps.

Did I understand correctly that the “irradiance” render output buffer is actually already the result of the conversion to sRGB? I ask because the viewport window seems to show a false-color representation of a scalar; I thought it might be the illuminance, or the Y channel respectively.

If the “irradiance” buffer actually contains sRGB data, what is then the difference between that and the “normal” color render output?

Sadly I did not find any documentation of the precise format and data type of the irradiance buffer.

Sorry for having to ask again, and thank you for your help!

Regards,

Arne