[TensorRT] INTERNAL ERROR: Assertion failed: cublasStatus == CUBLAS_STATUS_SUCCESS ../rtSafe/cublas/cublasLtWrapper.cpp:279

Description

When converting a CenterNet ONNX model to TensorRT, the conversion succeeds on a lower TensorRT version but fails with the following error on a higher version.

[TensorRT] INTERNAL ERROR: Assertion failed: cublasStatus == CUBLAS_STATUS_SUCCESS
…/rtSafe/cublas/cublasLtWrapper.cpp:279
Aborting…
[TensorRT] ERROR: …/rtSafe/cublas/cublasLtWrapper.cpp (279) - Assertion Error in getCublasLtHeuristic: 0 (cublasStatus == CUBLAS_STATUS_SUCCESS)
ERROR: failed to build the TensorRT engine!

Environment

TensorRT Version: 7.2.2.3
GPU Type: 1660Ti
Nvidia Driver Version: 455.23.05
CUDA Version: 11.1
CUDNN Version: 8.0.4
Operating System + Version: Ubuntu 18.04
Python Version (if applicable): 3.6
TensorFlow Version (if applicable):
PyTorch Version (if applicable): 1.7.1
Baremetal or Container (if container which image + tag):

Relevant Files

Hi @lcuknthing,

We were able to successfully convert the ONNX model to a TensorRT engine using the following trtexec command:
trtexec --onnx=20210324133230_fix_size.onnx --verbose --explicitBatch --shapes=input_image:1x3x448x448
This was tested on TensorRT version 7.2.2.3.
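As a side note, the --shapes argument binds an explicit shape to the named input tensor. A minimal sketch of how such a shape string decomposes into a tensor name and dimensions (this helper is purely illustrative and is not part of trtexec):

```python
def parse_shape_arg(arg: str):
    """Split a trtexec-style shape spec, e.g. "input_image:1x3x448x448",
    into (tensor name, tuple of dimensions). Illustrative helper only."""
    name, _, dims = arg.rpartition(":")
    return name, tuple(int(d) for d in dims.split("x"))

name, shape = parse_shape_arg("input_image:1x3x448x448")
print(name, shape)  # input_image (1, 3, 448, 448)
```

Here the shape 1x3x448x448 means batch size 1, 3 channels, and a 448x448 input image, which must match the model's expected input.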

Please validate the TensorRT installation steps you followed; for reference, see the installation guide.

Alternatively, we recommend trying the TensorRT NGC container to avoid system dependency issues:
https://ngc.nvidia.com/containers/nvidia:tensorrt
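For example, a container could be pulled and run roughly as follows. The tag 20.12-py3 is an assumption based on the 7.2.x timeframe; check the container release notes on NGC for the release that ships the TensorRT version you need.

```shell
# Pull a TensorRT NGC container (tag is an assumption; verify the
# bundled TensorRT version against the NGC release notes).
docker pull nvcr.io/nvidia/tensorrt:20.12-py3

# Run with GPU access, mounting the current directory so the
# ONNX model is visible inside the container.
docker run --gpus all -it --rm -v "$(pwd)":/workspace/models \
    nvcr.io/nvidia/tensorrt:20.12-py3

# Inside the container, the same trtexec command can then be run:
# trtexec --onnx=/workspace/models/20210324133230_fix_size.onnx \
#     --verbose --explicitBatch --shapes=input_image:1x3x448x448
```

Running inside the container rules out mismatches between the system CUDA, cuBLAS, and TensorRT libraries, which is a common cause of CUBLAS_STATUS errors at engine build time.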

Thank you.