How much time does it take for the NVIDIA AI-accelerated denoiser (ONND) to denoise a 1080p image?

I’m currently working on this topic. All the papers I have access to say that ONND (OptiX 7.5 version) requires about 50–100 ms to denoise a 1080p image, but what I measured myself is about 5 ms, which is a huge difference. I’m really confused. Thank you for your help.

OptiX denoiser performance depends on the denoiser model used (HDR (outdated), AOV (recommended), or temporal AOV), the number of inputs, and especially on the underlying GPU and the display driver version.
The newer the GPU architecture and the higher-end the GPU, the faster the denoiser runs. For example, GPUs with Tensor Cores will be faster than older boards without them. The denoiser is continuously improved, so newer display driver versions are expected to provide better image quality and/or performance.
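For reference, the denoiser model is selected when the denoiser is created. Here is a minimal sketch of creating a denoiser with the recommended AOV model using the OptiX 7.5 host API, assuming a valid `OptixDeviceContext` exists; error checking is omitted:

```cpp
#include <cuda.h>
#include <optix.h>

// Create a denoiser using the recommended AOV model.
// 'context' is assumed to be a valid OptixDeviceContext created elsewhere.
OptixDenoiser createAovDenoiser(OptixDeviceContext context)
{
    OptixDenoiserOptions options = {};
    options.guideAlbedo = 1; // an albedo guide layer will be provided
    options.guideNormal = 1; // a normal guide layer will be provided

    OptixDenoiser denoiser = nullptr;
    // OPTIX_DENOISER_MODEL_KIND_TEMPORAL_AOV would select the temporal
    // variant instead; OPTIX_DENOISER_MODEL_KIND_HDR is the outdated model.
    optixDenoiserCreate(context, OPTIX_DENOISER_MODEL_KIND_AOV, &options, &denoiser);
    return denoiser;
}
```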

I don’t have a per-denoiser-model, per-GPU performance table, but low single-digit millisecond results for a 1920×1080 noisy input image are common on current GPUs.
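For what it’s worth, here is a minimal sketch of timing the GPU execution of `optixDenoiserInvoke` with CUDA events. It assumes the denoiser, state and scratch buffers, guide layer, and input/output layers were set up beforehand (e.g. as in the OptiX SDK’s optixDenoiser sample); error checking is omitted:

```cpp
#include <cuda.h>
#include <cuda_runtime.h>
#include <optix.h>

// Time one denoiser launch with CUDA events. All OptiX objects and buffers
// are assumed to have been created and filled beforehand (see the OptiX SDK
// optixDenoiser sample); error checking is omitted for brevity.
float timeDenoiserInvoke(OptixDenoiser denoiser, CUstream stream,
                         const OptixDenoiserParams* params,
                         CUdeviceptr state, size_t stateSize,
                         const OptixDenoiserGuideLayer* guideLayer,
                         const OptixDenoiserLayer* layers, unsigned int numLayers,
                         CUdeviceptr scratch, size_t scratchSize)
{
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start, stream);
    optixDenoiserInvoke(denoiser, stream, params, state, stateSize,
                        guideLayer, layers, numLayers,
                        0, 0, // inputOffsetX, inputOffsetY (no tiling)
                        scratch, scratchSize);
    cudaEventRecord(stop, stream);

    // Wait for the launch to finish so the measurement covers the full GPU
    // execution, not just the asynchronous API call on the host.
    cudaEventSynchronize(stop);
    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    return ms;
}
```

Averaging over a number of frames after a warm-up launch gives more stable numbers than a single measurement.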

When you find published performance results that differ from your own, you always need to compare the denoiser model, the GPU, and the display driver version as well.