I use OPTIX_DENOISER_MODEL_KIND_HDR to denoise an OPTIX_PIXEL_FORMAT_FLOAT4 buffer.
Generally the results look fine. But if the source image has a very high dynamic range, there are huge halos around directly visible light sources and around any glossy reflections.
How can I get rid of these artifacts?
As far as I can see, the OptiX denoiser internally performs an image-wide reduction to calculate the mean intensity.
Could one add yet another single-float :-) buffer to let OptiX perform an image-wide max(intensity) reduction, rescaling the input to the range for which the pretrained OPTIX_DENOISER_MODEL_KIND_HDR model was trained?
Or should I clamp the intensity across the whole input image to some predefined range myself?
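For reference, the clamping idea could look something like the host-side sketch below. The function name, the chosen ceiling, and the interleaved RGBA layout are my own assumptions for illustration; this is a pre-pass you would run on the FLOAT4 buffer before handing it to the denoiser, not anything OptiX provides:

```cpp
#include <algorithm>
#include <vector>

// Hypothetical pre-pass: clamp RGB radiance to a chosen ceiling before
// denoising, so that extreme pixel values cannot dominate the intensity
// estimate. Assumes an interleaved RGBA float buffer (FLOAT4 layout).
void clampRadiance(std::vector<float>& rgba, float maxIntensity)
{
    for (size_t i = 0; i + 3 < rgba.size(); i += 4)
    {
        rgba[i + 0] = std::min(rgba[i + 0], maxIntensity); // R
        rgba[i + 1] = std::min(rgba[i + 1], maxIntensity); // G
        rgba[i + 2] = std::min(rgba[i + 2], maxIntensity); // B
        // alpha (i + 3) is left untouched
    }
}
```

In a real renderer this loop would of course run as a CUDA kernel on the device buffer rather than on a host-side vector.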
If your situation allows, you might consider applying all of your exposure & color correction before denoising, so that your image to denoise is low dynamic range. In some sense that’s equivalent to clamping HDR intensity. But I realize this might not be possible in your workflow.
I don’t know how to avoid the artifacts when denoising HDR; I am asking the denoiser engineers and will update with their response. In the meantime, I’m guessing (speculating wildly!) that you could also try modifying the intensity value that optixDenoiserComputeIntensity() computes - divide it by some factor of your choosing.
Long time no hear! I have almost finished porting my code to OptiX 7 (I must admit it was quite a rewrite…), and the denoiser is one of the last steps. I think this thread matches my question about the intensity value.
The name OptixDenoiserParams::hdrIntensity suggests it is relevant for OPTIX_DENOISER_MODEL_KIND_HDR. However, it seems that the intensity is also used with the OPTIX_DENOISER_MODEL_KIND_LDR model. If I set the intensity to 0, the output images are filled with extreme values. If I set it to 0.5 or 1.0, or use optixDenoiserComputeIntensity, the images are fine and look the same regardless of the value used.
What is the meaning of hdrIntensity for the LDR model?
I have also checked the value calculated by optixDenoiserComputeIntensity on my images clamped to the range 0.0–1.0. I get values of about 4.5. Is that reasonable? (It could be, e.g. if it is 1/intensity, but better to ask…)
The OptiX Programming Guide will be updated to explain the hdrIntensity calculation in more detail.
It affects the LDR denoiser as well in more recent versions, because that isn’t a separate kernel anymore, which means it’s fine to always calculate the hdrIntensity.