Error : tensorrt-7.2.2.3-cp38-none-linux_x86_64.whl is not a supported wheel on this platform

Description

tensorrt-7.2.2.3-cp38-none-linux_x86_64.whl is not a supported wheel on this platform.
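A quick sanity check for this kind of pip error (a minimal sketch, assuming pip >= 19.2 so the debug subcommand is available) is to confirm that the interpreter and pip actually used for the install match the cp38/linux_x86_64 tags in the wheel name:

    python3 --version                # should report Python 3.8.x to match the cp38 tag
    python3 -m pip --version         # confirm pip is bound to that same interpreter
    python3 -m pip debug --verbose   # lists the wheel tags this pip will accept

If cp38-none-linux_x86_64 does not appear in the accepted tags, pip is running under a different Python version or architecture than the wheel was built for.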

Environment

TensorRT Version: tensorrt-7.2.2.3
GPU Type: RTX 3090
Nvidia Driver Version: 455.28
CUDA Version: 11.1 (Runtime 11.0.1)
CUDNN Version: v8.0.5.39
Operating System + Version: Ubuntu 18.04.5
Python Version (if applicable): 3.8
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):

Relevant Files

Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (Github repo, Google Drive, Dropbox, etc.)

Steps To Reproduce

Please include:

  • Exact steps/commands to build your repro
  • Exact steps/commands to run your repro
  • Full traceback of errors encountered

Hi, we recommend checking the supported features at the link below:
https://docs.nvidia.com/deeplearning/tensorrt/support-matrix/index.html

Thanks!

Hi, I did check it and couldn’t find what was wrong.

Thanks for the help!

Hi @matthieuvanhoutte,

Sorry for the delayed response. Could you please provide more details about the issue?
We also recommend following the official TensorRT installation guide:
https://docs.nvidia.com/deeplearning/tensorrt/install-guide/index.html
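For reference, a rough sketch of the tar-package install flow from that guide (the archive name below is illustrative; use the one matching your CUDA/cuDNN versions, and install the wheel with the Python version named in its cp tag):

    # extract the TensorRT tar package (file name is an assumption for illustration)
    tar -xzvf TensorRT-7.2.2.3.Ubuntu-18.04.x86_64-gnu.cuda-11.1.cudnn8.0.tar.gz
    # make the TensorRT libraries visible to the loader
    export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$(pwd)/TensorRT-7.2.2.3/lib
    # install the Python wheel with the matching interpreter (cp38 -> Python 3.8)
    cd TensorRT-7.2.2.3/python
    python3.8 -m pip install tensorrt-7.2.2.3-cp38-none-linux_x86_64.whl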

Alternatively, you can use the TensorRT NGC container to avoid host-side dependencies:
https://ngc.nvidia.com/containers/nvidia:tensorrt
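A minimal sketch of pulling and running the container, assuming Docker and the NVIDIA Container Toolkit are already installed; the release tag below is an assumption, so pick one from NGC that ships TensorRT 7.2.2:

    # pull a TensorRT container image from NGC (tag is illustrative)
    docker pull nvcr.io/nvidia/tensorrt:20.12-py3
    # start an interactive container with GPU access
    docker run --gpus all -it --rm nvcr.io/nvidia/tensorrt:20.12-py3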

Thank you.