Hi @Mr_F, thanks again for the input data! The denoiser team was able to look at your data and reproduce the problem. You've identified a real bug, and it will be fixed in an upcoming driver; however, it may take a couple of months for the fix to land and then percolate through our QA and release process.
In the meantime, you can still use tiling to work around the issue, right? Hopefully you can continue to do that and the tiling option is not terribly slow – at 4K+ image sizes we would expect the overhead of tiling to be small compared to untiled denoising (with reduced memory consumption as a side benefit). Another option the team identified is to use the OptiX 7 denoiser with AOV mode enabled, which uses a different convolution kernel that is not subject to the issue you bumped into. If you have memory to spare, the OptiX 6 denoiser also has an option to raise the memory limit high enough that the denoiser enables automatic tiling on its own; that limit should be somewhere in the neighborhood of 200 MB.
If you were already thinking of moving to OptiX 7 at some point, maybe this is a decent excuse to try it. If that's not reasonable at the moment, we expect the fix for this issue to appear in drivers numbered 495 and higher, a couple of months from now. Apologies for the bump in the road, and thank you kindly for reporting it and sharing repro data. Let me know if you have any questions.