TensorRT inference time is not stable

Hello, when I run inference on the engine file from C++, the inference time is not stable (16-200 ms) if the NVIDIA driver is a newer version.
With an older NVIDIA driver the inference time is stable (16 ms), for example when nvGameS.dll is version 27.
I also tested different CUDA and TensorRT versions. I would like to ask what the reason for this is. Thanks for your answer.
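For reference, here is a minimal sketch of the kind of timing loop I am describing (not my exact code). It assumes a TensorRT 7/8-style API, a static-shape FP32 engine, and a placeholder engine path `model.engine`; warm-up runs are excluded and each iteration is timed with CUDA events, so clock ramp-up and host-side jitter should not account for the 16-200 ms spread.

```cpp
#include <NvInfer.h>
#include <cuda_runtime_api.h>

#include <fstream>
#include <iostream>
#include <iterator>
#include <vector>

// Minimal logger required by the TensorRT runtime.
class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING) std::cout << msg << std::endl;
    }
} gLogger;

int main() {
    // "model.engine" is a placeholder path for the serialized engine file.
    std::ifstream file("model.engine", std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(file)),
                           std::istreambuf_iterator<char>());

    nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(gLogger);
    nvinfer1::ICudaEngine* engine =
        runtime->deserializeCudaEngine(blob.data(), blob.size());
    nvinfer1::IExecutionContext* context = engine->createExecutionContext();

    // One device buffer per binding; assumes static shapes and FP32 tensors.
    std::vector<void*> bindings(engine->getNbBindings());
    for (int i = 0; i < engine->getNbBindings(); ++i) {
        nvinfer1::Dims dims = engine->getBindingDimensions(i);
        size_t count = 1;
        for (int d = 0; d < dims.nbDims; ++d) count *= dims.d[d];
        cudaMalloc(&bindings[i], count * sizeof(float));
    }

    cudaStream_t stream;
    cudaStreamCreate(&stream);

    // Warm-up iterations so lazy initialization and GPU clock ramp-up
    // are not counted in the measured latency.
    for (int i = 0; i < 20; ++i)
        context->enqueueV2(bindings.data(), stream, nullptr);
    cudaStreamSynchronize(stream);

    // Time each run with CUDA events so only GPU execution time is measured.
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);
    for (int i = 0; i < 100; ++i) {
        cudaEventRecord(start, stream);
        context->enqueueV2(bindings.data(), stream, nullptr);
        cudaEventRecord(stop, stream);
        cudaEventSynchronize(stop);
        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        std::cout << "iteration " << i << ": " << ms << " ms" << std::endl;
    }

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaStreamDestroy(stream);
    for (void* ptr : bindings) cudaFree(ptr);
    return 0;
}
```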

Hello @827929990, welcome to the NVIDIA developer forums!

Since your question seems TensorRT-specific, I took the liberty of moving the topic to the corresponding forum category.

I hope that is fine with you!

Best of success with your project!

Hi,

Could you please give us more details about your platform?

Environment

TensorRT Version:
GPU Type:
Nvidia Driver Version:
CUDA Version:
CUDNN Version:
Operating System + Version:
Python Version (if applicable):
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Bare-metal or container (if container, which image + tag):

Thank you.