I noticed that with some data the OptiX denoiser sometimes produces a pattern like this:
This can usually be worked around by scaling the image intensity by an arbitrary factor before denoising. My approach is to tonemap the image to LDR and then use the LDR denoiser. I also tried skipping the tonemap and using the HDR denoiser, but the results are similar. The issue is especially prominent when the input data has very high intensities, but it is sometimes also visible on images with a fairly modest dynamic range.
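For reference, here is a minimal sketch of the kind of pre-denoise scaling/tonemapping I mean. The exposure factor and the Reinhard-style operator are illustrative assumptions, not my exact pipeline; the point is only that the buffer handed to the LDR denoiser ends up in [0, 1).

```cpp
#include <cstddef>
#include <vector>

// Sketch: scale HDR intensities by an arbitrary exposure factor and apply a
// simple Reinhard tonemap so the buffer fed to the LDR denoiser stays in [0, 1).
// exposureScale is a placeholder (e.g. 1 / maxLuminance of the image).
void prepareForLdrDenoise(std::vector<float>& rgb /* interleaved float3 */,
                          float exposureScale)
{
    for (std::size_t i = 0; i < rgb.size(); ++i) {
        float v = rgb[i] * exposureScale;   // arbitrary intensity scaling
        rgb[i]  = v / (1.0f + v);           // Reinhard: maps [0, inf) -> [0, 1)
    }
}
```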
Some interesting bits of information:
I'm pretty sure I didn't encounter it before, and I updated my driver recently (from 418.81 to 457.30), so that may be related. Then again, maybe I simply didn't notice it earlier, although I've used the denoiser for a few years.
It happens with both OptiX 6.0 and OptiX 7.2; however, it doesn't happen with OptiX 5.1.
I'm on a GTX 1060. The same problem was previously reported to me by GTX 20XX users.
The images I denoise are actually lightmaps. I wonder if they don't play well with the (latest?) training dataset? It used to work pretty well on them.
The images are originally stored as half3 and converted to float3 before denoising.
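The conversion is nothing unusual; roughly like this, assuming tightly packed 3-channel buffers with no padding (buffer names are placeholders for illustration):

```cpp
#include <cstddef>
#include <cuda_fp16.h>   // __half, __half2float

// Sketch: widen a packed half3 buffer to float3 before handing it to the denoiser.
// src holds pixelCount * 3 half-precision channels, dst the same count of floats.
void halfToFloatRGB(const __half* src, float* dst, std::size_t pixelCount)
{
    for (std::size_t i = 0; i < pixelCount * 3; ++i)
        dst[i] = __half2float(src[i]);   // widen each channel to 32-bit
}
```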
Here is a less extreme case where the pattern is still visible: