Description
When running inference with the TensorRT engine, I got errors like the following (a minimal sketch of the inference call is shown after the log):
[E] [TRT] …/rtSafe/cuda/cudaConvolutionRunner.cpp (483) - Cudnn Error in executeConv: 3 (CUDNN_STATUS_BAD_PARAM)
[E] [TRT] FAILED_EXECUTION: std::exception
[E] [TRT] engine.cpp (1036) - Cuda Error in executeInternal: 700 (an illegal memory access was encountered)
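For context, this is roughly how the engine is deserialized and executed. This is only a sketch, not the exact pipeline: the engine path "model.engine", FP32 bindings, and the use of executeV2 are assumptions. Errors like CUDNN_STATUS_BAD_PARAM followed by an illegal memory access at execute time are often related to device buffers that do not match the engine's binding sizes, which is why the buffer allocation is shown explicitly.

```cpp
// Minimal TensorRT 7.2 inference sketch (assumptions: engine file "model.engine",
// FP32 bindings, synchronous executeV2; the real pipeline may differ).
#include <NvInfer.h>
#include <cuda_runtime_api.h>
#include <fstream>
#include <iostream>
#include <vector>

class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) override {
        if (severity <= Severity::kWARNING) std::cout << msg << std::endl;
    }
} gLogger;

// Size of one binding in bytes (for an explicit-batch engine the batch
// dimension is already part of the binding dims).
static size_t bindingBytes(const nvinfer1::Dims& d) {
    size_t n = 1;
    for (int i = 0; i < d.nbDims; ++i) n *= static_cast<size_t>(d.d[i]);
    return n * sizeof(float);  // assumes FP32 bindings
}

int main() {
    // Load the serialized engine (path is a placeholder).
    std::ifstream f("model.engine", std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(f)),
                           std::istreambuf_iterator<char>());

    auto* runtime = nvinfer1::createInferRuntime(gLogger);
    auto* engine  = runtime->deserializeCudaEngine(blob.data(), blob.size(), nullptr);
    auto* context = engine->createExecutionContext();

    // Allocate one device buffer per binding, sized from the engine's binding
    // dims; undersized or unset buffers are a common cause of
    // CUDNN_STATUS_BAD_PARAM / illegal-memory-access errors at execute time.
    std::vector<void*> bindings(engine->getNbBindings(), nullptr);
    for (int i = 0; i < engine->getNbBindings(); ++i)
        cudaMalloc(&bindings[i], bindingBytes(engine->getBindingDimensions(i)));

    // (Real input data would be copied into the input bindings here with cudaMemcpy.)

    bool ok = context->executeV2(bindings.data());
    std::cout << "executeV2 returned " << ok << std::endl;

    for (void* p : bindings) cudaFree(p);
    context->destroy();
    engine->destroy();
    runtime->destroy();
    return ok ? 0 : 1;
}
```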
Environment
TensorRT Version : 7.2.2.3
GPU Type : TITAN V
Nvidia Driver Version : 440.33.01
CUDA Version : 10.2
CUDNN Version : 8.0.2
Operating System + Version : Ubuntu 16.04