OptiX 7 denoiser halo around bright objects

I use OPTIX_DENOISER_MODEL_KIND_HDR to denoise an OPTIX_PIXEL_FORMAT_FLOAT4 buffer.
Generally the results look fine. But if the source image has a very high dynamic range, there is a huge halo around directly visible light sources and around any glossy reflections.
How can I get rid of these artifacts?
As far as I can see, the OptiX denoiser internally performs an image-wide reduction to calculate the mean intensity:

CUDA_CHECK(cudaMalloc(reinterpret_cast<void**>(&intensity), sizeof(float)));
// Image-wide reduction: writes a single average-intensity float to the device buffer.
OPTIX_CHECK(optixDenoiserComputeIntensity(denoiser, stream, &inputLayer, intensity,
                                          scratch, sizes.recommendedScratchSizeInBytes));
OptixDenoiserParams params = {};
params.denoiseAlpha = 1;
params.hdrIntensity = intensity; // device pointer to the computed intensity

OPTIX_CHECK(optixDenoiserInvoke(denoiser, stream, &params,
                                denoiserData, sizes.stateSizeInBytes,
                                &inputLayer, 2, // two input layers: beauty + albedo
                                0, 0,           // inputOffsetX, inputOffsetY
                                &outputLayer,
                                scratch, sizes.recommendedScratchSizeInBytes));

Could one add yet another single-float :-) buffer to let OptiX perform an image-wide max(intensity) reduction, to rescale the input range to the range the pretrained OPTIX_DENOISER_MODEL_KIND_HDR model was trained for?

Should I clamp the values over the whole input image to some predefined range myself?

Hi tomilovanatoliy, welcome!

If your situation allows, you might consider applying all of your exposure & color correction before denoising, so that the image you denoise is low dynamic range. In some sense that’s equivalent to clamping the HDR intensity. But I realize this might not be possible in your workflow.
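
Purely as illustration (a quick sketch, not tested; I'm assuming a packed float4 beauty buffer of numPixels texels, an exposure value from your pipeline, and I've picked a simple per-channel Reinhard tonemap just as an example), the idea would be something like:

__global__ void exposeAndTonemap(float4* pixels, unsigned int numPixels, float exposure)
{
    const unsigned int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numPixels)
        return;
    float4 p = pixels[i];
    // Apply exposure, then a simple per-channel Reinhard tonemap so the
    // buffer handed to the denoiser is effectively in [0, 1).
    p.x *= exposure;  p.y *= exposure;  p.z *= exposure;
    p.x /= 1.0f + p.x;
    p.y /= 1.0f + p.y;
    p.z /= 1.0f + p.z;
    pixels[i] = p; // alpha left untouched
}

// Launch, e.g.:
// exposeAndTonemap<<<(numPixels + 255) / 256, 256>>>(d_beauty, numPixels, exposure);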

I don’t know how to avoid the artifacts when denoising HDR; I am asking the denoiser engineers and will update with their response. In the meantime I’m guessing (speculating wildly!) that you could also try modifying the intensity value that optixDenoiserComputeIntensity() computes, e.g. dividing it by some factor of your choosing.
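
Something along these lines (again, just a sketch; kIntensityScale is a made-up tuning constant, and intensity is the CUdeviceptr from your snippet):

float hostIntensity = 0.0f;
CUDA_CHECK(cudaStreamSynchronize(stream)); // make sure optixDenoiserComputeIntensity() has finished
CUDA_CHECK(cudaMemcpy(&hostIntensity, reinterpret_cast<void*>(intensity),
                      sizeof(float), cudaMemcpyDeviceToHost));
hostIntensity /= kIntensityScale; // hypothetical factor of your choosing, tune per scene
CUDA_CHECK(cudaMemcpy(reinterpret_cast<void*>(intensity), &hostIntensity,
                      sizeof(float), cudaMemcpyHostToDevice));
params.hdrIntensity = intensity; // then call optixDenoiserInvoke() as before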


David.

Thank you for the reply. It seems I have to add a dependency on the CUDA runtime, or use nppiThreshold from NPP.

The HDR denoiser handles values in the range of [0, 10000]. If you have higher values than that, it’s recommended to clamp them.
https://raytracing-docs.nvidia.com/optix7/guide/index.html#ai_denoiser#nvidia-ai-denoiser
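
For example, a simple clamp kernel over the noisy beauty buffer could look like this (a minimal sketch, assuming a packed float4 image of numPixels texels; this also avoids the NPP dependency mentioned above):

__global__ void clampToDenoiserRange(float4* pixels, unsigned int numPixels)
{
    const unsigned int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numPixels)
        return;
    float4 p = pixels[i];
    p.x = fminf(p.x, 10000.0f); // HDR denoiser input range is [0, 10000]
    p.y = fminf(p.y, 10000.0f);
    p.z = fminf(p.z, 10000.0f);
    pixels[i] = p;
}

// clampToDenoiserRange<<<(numPixels + 255) / 256, 256>>>(d_beauty, numPixels);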

The colored fringes around bright areas are a typical artifact when feeding HDR images to the LDR denoiser as well. They should also happen when the HDR range is exceeded.
https://raytracing-docs.nvidia.com/optix6/guide_6_5/index.html#post_processing_framework#6116

The OptixDenoiserParams::hdrIntensity helps more with the denoising of rather dark images.

Long time no hear! I have almost finished porting my code to OptiX 7 (I must admit it was quite a rewrite…), and the denoiser is one of the last steps. I think this thread matches my question about the intensity value.

The name OptixDenoiserParams::hdrIntensity suggests it is relevant only for OPTIX_DENOISER_MODEL_KIND_HDR. However, it seems the intensity is also used with the OPTIX_DENOISER_MODEL_KIND_LDR model. If I set the intensity to 0, the output images are filled with extreme values. If I set it to 0.5 or 1.0, or use optixDenoiserComputeIntensity, then the images are fine and look the same regardless of the value used.

What is the meaning of hdrIntensity for the LDR model?

I have also checked the value calculated by optixDenoiserComputeIntensity on my images clamped to the range 0.0–1.0. I get values of about 4.5. Is that reasonable? (It could be, if e.g. it is 1/intensity, but better to ask…)

Thanks!

The OptiX Programming Guide will be updated to explain the hdrIntensity calculation in more detail.

It affects the LDR denoiser as well in more recent versions, because that isn’t a separate kernel anymore, which means it’s fine to always calculate the hdrIntensity.

That is great to hear, thank you.