Failed to run forward inference with TensorRT


When I ran forward inference with a TensorRT-accelerated YOLOv5 model, it did not work properly. The error info is as follows:

[12/01/2020-08:58:36] [E] [TRT] ../rtSafe/cuda/cudaConvolutionRunner.cpp (457) - Cudnn Error in execute: 3 (CUDNN_STATUS_BAD_PARAM)
[12/01/2020-08:58:36] [E] [TRT] FAILED_EXECUTION: std::exception


TensorRT Version:
GPU Type: RTX 2080 Ti × 2
Nvidia Driver Version: 440.95
CUDA Version: 10.2
CUDNN Version: 7.6.5
Operating System + Version: CentOS 7
Python Version (if applicable): 3.6
TensorFlow Version (if applicable): -
PyTorch Version (if applicable): -
Baremetal or Container (if container which image + tag):


How can I solve this? Please help me, thanks!

Hi @1965281904,
Can you please share your ONNX model with us?

Your cuDNN version may be the issue; cuDNN 8.0.2 is preferred for this setup.
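To confirm which cuDNN version is actually installed before upgrading, you can read the version macros from the cuDNN header (typically `/usr/include/cudnn.h` on older releases, or `cudnn_version.h` on cuDNN 8+; the exact path is an assumption and may differ on your system). A minimal sketch of parsing those macros:

```python
# Minimal sketch: parse the CUDNN_MAJOR/MINOR/PATCHLEVEL #define lines
# from a cuDNN header to determine the installed version.
# The header path below is an assumption; adjust it for your install.
import re

def parse_cudnn_version(header_text: str) -> str:
    """Extract MAJOR.MINOR.PATCHLEVEL from cuDNN header #define lines."""
    fields = {}
    for name in ("CUDNN_MAJOR", "CUDNN_MINOR", "CUDNN_PATCHLEVEL"):
        m = re.search(rf"#define\s+{name}\s+(\d+)", header_text)
        fields[name] = m.group(1) if m else "?"
    return "{CUDNN_MAJOR}.{CUDNN_MINOR}.{CUDNN_PATCHLEVEL}".format(**fields)

# Example header contents matching the environment reported above:
sample = """
#define CUDNN_MAJOR 7
#define CUDNN_MINOR 6
#define CUDNN_PATCHLEVEL 5
"""
print(parse_cudnn_version(sample))  # 7.6.5

# On a real system you would read the header instead, e.g.:
#   with open("/usr/include/cudnn.h") as f:
#       print(parse_cudnn_version(f.read()))
```

If the reported version does not match what your TensorRT build was compiled against, reinstall the matching cuDNN before retrying inference.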