HDR denoising questions

It looks like one of the new features in OptiX 5.1 is its ability to denoise HDR images, which is a very welcome addition. I assume this means you would pass in a linear HDR image and get back a denoised version of that HDR image (still in linear space).

If this is the case, I have two questions:

  1. The documentation in the OptiX Programming Guide says that the values for the input_buffer must be between 0 and 1 and should be gamma encoded. Is this an oversight?
  2. It seems like we’d need some way of telling the denoiser whether the input is a linear HDR image vs. a gamma corrected LDR image so it could select the appropriate training data. Am I missing something here?


The input buffer for the LDR denoiser should be gamma corrected and ideally should not contain values higher than 10. Values up to that range might still be handled gracefully, but anything higher will produce incorrect results, which appear as colored corruption around bright HDR areas. (This is mentioned in the online docs about the HDR mode.)

The new HDR mode in OptiX 5.1.0 can be enabled by setting the unsigned int variable “hdr” on the denoiser stage.
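As a rough sketch (not a complete program: `context`, `inputBuffer`, `outputBuffer`, and the launch dimensions are assumed to exist already, and entry point 0 is assumed to be the renderer), setting that variable with the OptiX 5 C++ host API looks like this:

```cpp
// Create the built-in DL denoiser post-processing stage and enable HDR mode.
optix::PostprocessingStage denoiser =
    context->createBuiltinPostProcessingStage("DLDenoiser");
denoiser->declareVariable("input_buffer")->set(inputBuffer);
denoiser->declareVariable("output_buffer")->set(outputBuffer);
denoiser->declareVariable("hdr")->setUint(1u); // 1 = HDR input, 0 = LDR (default)

// Run the render launch, then the denoiser, via a command list.
optix::CommandList commandList = context->createCommandList();
commandList->appendLaunch(0, width, height);
commandList->appendPostprocessingStage(denoiser, width, height);
commandList->finalize();
commandList->execute();
```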

It’s documented in the OptiX Programming Guide, chapter 6.4.2, “Denoiser variables”.

BTW, I recommend using the online docs at http://raytracing-docs.nvidia.com/optix/index.html, which can be more up to date than the documentation shipped with the actual OptiX release. That is in fact the case for the “hdr” variable.

In any case, if you look at the sticky post which links to the OptiX Advanced Samples on github.com, the optixIntro_09 and optixIntro_10 examples show how to use the DL Denoiser in LDR and HDR mode.

Actually, when using OptiX 5.1.0, simply prefer the HDR mode. I have not yet clamped the LDR values to the range [0, 1] on the input buffer at the end of the custom tonemapper launch; that should be added when using the optixIntro_09 example with OptiX 5.1.0.
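For illustration, the missing clamp is just a per-channel saturate applied to each pixel the tonemapper writes before the LDR denoiser sees it. A minimal host-side sketch (the `Rgb` struct and function names are hypothetical, not from the samples):

```cpp
#include <algorithm>

// Hypothetical pixel type standing in for the tonemapper's output format.
struct Rgb { float r, g, b; };

// Clamp a single channel to the LDR range [0, 1].
inline float clamp01(float v) { return std::min(std::max(v, 0.0f), 1.0f); }

// Clamp a tonemapped pixel so the LDR denoiser never sees out-of-range values.
inline Rgb clampToLdr(const Rgb& c) {
    return { clamp01(c.r), clamp01(c.g), clamp01(c.b) };
}
```

In the actual samples this would be done per pixel inside the tonemapper's CUDA launch, but the operation is the same.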

That’s super useful information in general. Thank you.