TensorRT "Unsupported SM: 0x809" error

Description

I tried to deploy TensorRT on my machine to convert an ONNX model to a TensorRT engine, but I keep getting the error "Unsupported SM: 0x809". Here is the full screenshot of the error:

I've tried several versions of TensorRT, along with the matching CUDA versions, but none of them worked, and I want to understand why.

I am working on Windows and using VS Code.

Environment

TensorRT Version: 8.5.1.7
GPU Type: RTX 4060 Ti
Nvidia Driver Version: 536.23
CUDA Version: 11.5.2
CUDNN Version: 8.9.4.25
Operating System + Version: Windows 10
Python Version (if applicable): 3.8.9
TensorFlow Version (if applicable): none
PyTorch Version (if applicable): 1.12.1+cu113
Baremetal or Container (if container which image + tag): none

Relevant Files

Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (Github repo, Google Drive, Dropbox, etc.)

Steps To Reproduce

Please include:

  • Exact steps/commands to build your repro
  • Exact steps/commands to run your repro
  • Full traceback of errors encountered

This is my test code:
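Roughly, it follows the standard ONNX-parser build flow; a simplified sketch with placeholder paths (not my exact script) looks like this, and the "Unsupported SM" error appears during the build step:

```python
import tensorrt as trt

# Placeholder paths, not the actual files from my project.
ONNX_PATH = "model.onnx"
ENGINE_PATH = "model.engine"

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
parser = trt.OnnxParser(network, logger)

# Parse the ONNX model into the TensorRT network definition.
with open(ONNX_PATH, "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("Failed to parse the ONNX model")

config = builder.create_builder_config()
config.set_memory_pool_limit(trt.MemoryPoolType.WORKSPACE, 1 << 30)  # 1 GiB

# Build and serialize the engine; this is where the SM error is raised.
serialized_engine = builder.build_serialized_network(network, config)
if serialized_engine is None:
    raise RuntimeError("Engine build failed")

with open(ENGINE_PATH, "wb") as f:
    f.write(serialized_engine)
```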

Hi,

The 4060 Ti GPU has a CUDA compute capability of 8.9.
It requires CUDA Toolkit 11.8 or newer.
Please upgrade your CUDA version and try again, and refer to the CUDA compute capability support documentation for details.
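As a quick sanity check after upgrading (assuming the PyTorch and tensorrt Python packages are installed), something like the following prints the GPU's compute capability and the toolchain versions in use:

```python
import torch
import tensorrt as trt

# Compute capability of the first GPU; the RTX 4060 Ti reports 8.9.
major, minor = torch.cuda.get_device_capability(0)
print(f"GPU: {torch.cuda.get_device_name(0)}")
print(f"Compute capability: {major}.{minor}")

# CUDA version PyTorch was built against and the installed TensorRT version.
print(f"PyTorch built against CUDA: {torch.version.cuda}")
print(f"TensorRT version: {trt.__version__}")
```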

We also recommend using the latest TensorRT version, 8.6.1.

Thank you.


Thank you! I would also like to ask: is it common to convert the same ONNX model to an engine with different TensorRT versions, for example TensorRT 8.2 and TensorRT 8.6? Will an engine I converted on version 8.2 work with version 8.6?

Please refer to the following document, which may help you:
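In short, a serialized engine is tied to the exact TensorRT version (and GPU) it was built with, so an engine built with 8.2 should be rebuilt from the ONNX model under 8.6 rather than reused. A small sketch of what that mismatch looks like at load time (the engine file name here is hypothetical):

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
runtime = trt.Runtime(logger)

# Hypothetical file built with an older TensorRT release.
with open("model_built_with_8.2.engine", "rb") as f:
    engine = runtime.deserialize_cuda_engine(f.read())

if engine is None:
    # Typical outcome when builder and runtime versions differ:
    # rebuild the engine from ONNX with the TensorRT version you deploy.
    print("Deserialization failed; rebuild the engine with this TensorRT version.")
```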
