[TRT]: INVALID_CONFIG: The engine plan file is generated on an incompatible device, expecting compute 8.6 got compute 7.5, please rebuild

Description

Hi, I had a problem deserializing a TensorRT engine that was built on an RTX 3070 and run on an RTX 2080 SUPER.

The error was as follows:
[TRT]: INVALID_CONFIG: The engine plan file is generated on an incompatible device, expecting compute 8.6 got compute 7.5, please rebuild.

Environment

Except for the GPU, everything, including the TensorRT and CUDA versions, was the same on both machines, since they used the same Docker container.
TensorRT Version: 7.2.2.1+cuda11.1
GPU Type: Deserialized : RTX 2080 SUPER / Serialized : RTX 3070
Nvidia Driver Version: 465.19.01
CUDA Version: 11.1
CUDNN Version: 8.0.5
Operating System + Version: Ubuntu 20.04
Python Version (if applicable): python 3.6.9
TensorFlow Version (if applicable): x
PyTorch Version (if applicable): 1.8.0+cu111
Baremetal or Container (if container which image + tag):

My question is: is it possible to serialize an engine file on, for example, a compute capability 8.6 device and deserialize it on, for example, a compute capability 7.5 device, i.e., a lower compute capability than the one it was serialized on?

My guess is that if I build the engine file on a device with a higher compute capability, it could be deserialized on one with a lower compute capability by specifying the target compute architecture and SM code at build time.
For example: cmake … -DGPU_ARCHS="75"

Thanks in advance,
Joe Jang


Hi @user30739,

Engines created by TensorRT are specific to both the TensorRT version with which they were created and the GPU on which they were created; a serialized plan is not portable across compute capabilities, in either direction.
I would suggest rebuilding the engine on the system where you plan to run inference.
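To illustrate why this fails rather than silently degrading: a TensorRT 7.x plan file embeds the compute capability it was built for, and the runtime requires an exact match with the GPU at load time. The helper below is a hypothetical sketch of that check (it is not TensorRT API), using the compute capabilities from this thread, SM 8.6 for the RTX 3070 and SM 7.5 for the RTX 2080 SUPER:

```python
def can_load_plan(built_cc, device_cc):
    """Return True if a plan built for compute capability `built_cc`
    (a (major, minor) tuple) can be deserialized on a device with
    capability `device_cc`.

    Hypothetical helper: TensorRT 7.x requires an exact match, so a
    plan built for SM 8.6 cannot be loaded on SM 7.5 (or vice versa).
    """
    return built_cc == device_cc


# RTX 3070 is SM 8.6; RTX 2080 SUPER is SM 7.5.
print(can_load_plan((8, 6), (7, 5)))  # False -> INVALID_CONFIG, "please rebuild"
print(can_load_plan((8, 6), (8, 6)))  # True  -> plan deserializes
```

In practice, the simplest route is to copy the original model (e.g. the ONNX file) to the 2080 SUPER machine and rebuild the plan there, for example with `trtexec --onnx=model.onnx --saveEngine=model.plan`, so the engine is generated for that GPU's compute capability.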

Thanks