It looks like one of the new features in OptiX 5.1 is the ability to denoise HDR images, which is a very welcome addition. I assume this means you would pass in a linear HDR image and get back a denoised version of that image, still in linear space.
If this is the case, I have two questions:
- The OptiX programming guide says that the values in the input_buffer must be between 0 and 1 and should be gamma encoded. Is this an oversight, given that linear HDR values fall outside that range?
- It seems like we’d need some way of telling the denoiser whether the input is a linear HDR image vs. a gamma-corrected LDR image, so it could select the appropriate training data. Am I missing something here?
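For context, here is a minimal sketch of the denoiser setup I'm asking about, using the OptiX 5 C++ wrapper. The stage name ("DLDenoiser") and buffer variable names ("input_buffer", "output_buffer") are taken from the programming guide; the RT_FORMAT_FLOAT4 buffer format and image dimensions are my own assumptions, and nothing here indicates whether the input is expected to be linear HDR or gamma-encoded LDR:

```cpp
#include <optixu/optixpp_namespace.h>

int main()
{
    optix::Context context = optix::Context::create();

    // Assumed format/size; the guide doesn't say whether values
    // outside [0, 1] (i.e. linear HDR) are allowed here.
    const unsigned width = 1920, height = 1080;
    optix::Buffer input  = context->createBuffer(
        RT_BUFFER_INPUT_OUTPUT, RT_FORMAT_FLOAT4, width, height);
    optix::Buffer output = context->createBuffer(
        RT_BUFFER_INPUT_OUTPUT, RT_FORMAT_FLOAT4, width, height);

    // Create the built-in deep-learning denoiser stage and bind buffers.
    optix::PostprocessingStage denoiser =
        context->createBuiltinPostProcessingStage("DLDenoiser");
    denoiser->declareVariable("input_buffer")->set(input);
    denoiser->declareVariable("output_buffer")->set(output);

    // Run the stage through a command list.
    optix::CommandList commands = context->createCommandList();
    commands->appendPostprocessingStage(denoiser, width, height);
    commands->finalize();

    // ... render into `input`, then:
    commands->execute();

    return 0;
}
```

I don't see a stage variable in this setup for declaring the input's color space, which is what prompted the second question above.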