Denoising benchmarks with GeForce GTX 1080

Hi guys,
I wrote a program that uses the OptiX AI Denoiser to denoise a single rendered image.
However, it takes a long time to execute the denoising operation:

rtCommandListExecute(my_CommandListWithDenoiser);

This command takes about 1200-1500 ms on a GeForce GTX 1080 at a resolution of 700x774 pixels. If anyone could share some benchmarks or has any suggestions on how to speed up my execution time, I would be very thankful. I am using OptiX 5.1!

Thank you in advance, Jakob

You’re probably measuring the first-time initialization overhead.

Please try optixIntro_10 of my OptiX Introduction examples, which uses the HDR denoiser in OptiX 5.1.0:
[url]https://devtalk.nvidia.com/default/topic/998546/optix/optix-advanced-samples-on-github/[/url]
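As a rough sketch of how one might exclude that overhead when benchmarking (assuming the command list from the original post, that the context and buffers are already set up, and that rtCommandListExecute blocks until the denoise has completed; if it does not on your setup, a synchronization would be needed before stopping the timer):

#include <chrono>
#include <optix.h>   // OptiX 5.x host API (RTcommandlist, rtCommandListExecute)

// Hypothetical helper: average denoise time in milliseconds after a warm-up run.
double averageDenoiseMs(RTcommandlist my_CommandListWithDenoiser, int runs = 10)
{
    // Warm-up: the first execution pays the one-time initialization cost
    // and is deliberately excluded from the measurement.
    rtCommandListExecute(my_CommandListWithDenoiser);

    auto start = std::chrono::high_resolution_clock::now();
    for (int i = 0; i < runs; ++i)
    {
        rtCommandListExecute(my_CommandListWithDenoiser);
    }
    auto end = std::chrono::high_resolution_clock::now();

    return std::chrono::duration<double, std::milli>(end - start).count() / runs;
}

Measured that way, the reported number should reflect the steady-state denoise time rather than the first-run setup cost.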

Hi, thank you for your answer!
I checked out your examples and measured the performance there as well. The first denoising iteration also took very long (about 2000 ms), but subsequent iterations were much faster (about 20 ms). Problem solved!
Thank you very much for your help! Cheers, Jakob