TensorRT 7 does not support switching GPU devices


I am currently accelerating a YOLOv5 model with TensorRT. It runs fine on GPU 0 of my server, but when I switch to GPU 1 I hit the following errors:
[09/29/2020-10:57:03] [E] [TRT] …/rtSafe/safeContext.cpp (133) - Cudnn Error in configure: 7 (CUDNN_STATUS_MAPPING_ERROR)
[09/29/2020-10:57:03] [E] [TRT] FAILED_EXECUTION: std::exception
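A `CUDNN_STATUS_MAPPING_ERROR` when moving to a different GPU is often caused by the CUDA device being selected after a context has already been created on device 0. One common workaround (a hedged sketch, not from this thread; `select_gpu` is a hypothetical helper name) is to pin the process to the target GPU via `CUDA_VISIBLE_DEVICES` before any CUDA or TensorRT import runs, so the desired card is the only device the process can see:

```python
import os

def select_gpu(device_id: int) -> None:
    """Hypothetical helper: restrict this process to a single physical GPU.

    Must run BEFORE any CUDA context is created (i.e., before importing
    tensorrt / pycuda or calling any CUDA API); once the driver is
    initialized, changing this variable has no effect.
    """
    os.environ["CUDA_VISIBLE_DEVICES"] = str(device_id)

# Make physical GPU 1 appear as the only device (it becomes logical device 0).
select_gpu(1)
```

Note also that a serialized TensorRT engine is tied to the GPU architecture it was built on; two 2080 Ti cards share the same compute capability, but rebuilding the engine on the target card is a safer sanity check.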


TensorRT Version: 7
GPU Type: GeForce RTX 2080 Ti
Nvidia Driver Version: 418.67
CUDA Version: 10
CUDNN Version: 7.6.5
Operating System + Version: CentOS 7
Python Version (if applicable): 3.6
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):

Relevant Files

Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (Github repo, Google Drive, Dropbox, etc.)

Steps To Reproduce

Please include:

  • Exact steps/commands to build your repro
  • Exact steps/commands to run your repro
  • Full traceback of errors encountered

Hi @1965281904,
Could you please try driver 440.33 with a CUDA 10.2 installation?


Hi @AakankshaS,

Since the server I am currently using is in production, I cannot upgrade its driver. Is there any other method I can try?

Hi @1965281904,
Can you please share your model for us to reproduce the issue?