TensorRT 8 UNet conversion error

Description

Hi,
I followed the blog: https://developer.nvidia.com/blog/speeding-up-deep-learning-inference-using-tensorrt/
But at the step "Import the ONNX model into TensorRT, generate the engine, and perform inference", I got this error:
[08/05/2021-10:43:41] [E] [TRT] 2: [ltWrapper.cpp::setupHeuristic::327] Error Code 2: Internal Error (Assertion cublasStatus == CUBLAS_STATUS_SUCCESS failed.)
[08/05/2021-10:43:41] [E] [TRT] 2: [builder.cpp::buildSerializedNetwork::417] Error Code 2: Internal Error (Assertion enginePtr != nullptr failed.)
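For reference, the step that fails is the ONNX parse plus engine build. Below is a minimal sketch of the equivalent call sequence using the TensorRT 8 Python API (the file names and the 1 GiB workspace size are placeholders; the actual reproduction uses the C++ simpleOnnx.cpp from the blog):

import tensorrt as trt

# Parse the ONNX model and build a serialized TensorRT engine.
logger = trt.Logger(trt.Logger.VERBOSE)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("unet.onnx", "rb") as f:  # placeholder model path
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise SystemExit("ONNX parsing failed")

config = builder.create_builder_config()
config.max_workspace_size = 1 << 30  # 1 GiB workspace, placeholder value

# build_serialized_network is the call whose C++ counterpart reports the
# cuBLAS assertion shown above.
serialized_engine = builder.build_serialized_network(network, config)
if serialized_engine is None:
    raise SystemExit("Engine build failed")

with open("unet.engine", "wb") as f:
    f.write(serialized_engine)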

I have checked the TensorRT installation and it is fine.

Environment

TensorRT Version: 8.0.1
GPU Type: V100
Nvidia Driver Version:
CUDA Version: 10.2
CUDNN Version: 7
Operating System + Version: Ubuntu 18.04
Python Version (if applicable): 3.6.9
PyTorch Version (if applicable): 1.9.0+cu102

Hi,
Please share the ONNX model and the script, if not already shared, so that we can assist you better.
Meanwhile, you can try a few things:

  1. Validate your model with the snippet below.

check_model.py

import sys
import onnx

# Path to your ONNX model, passed on the command line,
# e.g.: python3 check_model.py unet.onnx
filename = sys.argv[1]
model = onnx.load(filename)
onnx.checker.check_model(model)  # raises if the model is invalid
print("ONNX model check passed")
  2. Try running your model with the trtexec command.
https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/trtexec
If you are still facing the issue, please share the trtexec --verbose log for further debugging (an example invocation is shown below).
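For instance (unet.onnx here is just a placeholder for your model file), the command and log capture could look like:

trtexec --onnx=unet.onnx --verbose > trtexec_verbose.log 2>&1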
Thanks!

I have attached the UNet ONNX model, simpleOnnx.cpp, and the log file from running trtexec with --verbose.
I still get the same error when using trtexec.

Thanks
[UNet ONNX model] unet.onnx - Google Drive
log.txt (4.5 KB)
simpleOnnx.cpp (10.2 KB)

Hi @manhtb310,

We are unable to reproduce this issue on a V100 GPU; we could successfully build the TensorRT engine. We recommend making sure you have installed the dependencies and TensorRT correctly. If you still face this issue, please provide the trtexec command and the complete verbose logs.
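As one quick way to check the installation from Python (a minimal sketch; it only confirms that the TensorRT bindings import and that a builder can be created on the GPU, not that every dependency such as cuBLAS or cuDNN is intact):

import tensorrt as trt

# Confirm that the TensorRT Python bindings load and report the expected version.
print("TensorRT version:", trt.__version__)

# Creating a builder requires a working CUDA setup on the target GPU.
logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
print("Builder created, fast FP16 support:", builder.platform_has_fast_fp16)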

Thank you.

I got the same error. The ONNX model is from the ONNX model zoo:

curl -O https://media.githubusercontent.com/media/onnx/models/master/vision/classification/resnet/model/resnet18-v1-7.onnx

./trtexec --onnx=resnet18-v1-7.onnx

Docker image: tensorrt-ubuntu18.04-cuda10.2:latest
I tried building from source for both the master branch and 8.0.1:

[09/05/2021-18:33:01] [E] Error[2]: [ltWrapper.cpp::setupHeuristic::327] Error Code 2: Internal Error (Assertion cublasStatus == CUBLAS_STATUS_SUCCESS failed.)
[09/05/2021-18:33:01] [E] Error[2]: [builder.cpp::buildSerializedNetwork::417] Error Code 2: Internal Error (Assertion enginePtr != nullptr failed.)
Segmentation fault (core dumped)

Using CUDA 11.3 seems to work:

./docker/build.sh --file docker/ubuntu-18.04.Dockerfile --tag tensorrt-ubuntu18.04-cuda11.3 --cuda 11.3.1
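If the rebuilt image works for you as well, the same reproduction can be retried inside it; assuming the repo's launch script and flags are unchanged from its README, something like:

./docker/launch.sh --tag tensorrt-ubuntu18.04-cuda11.3 --gpus all
./trtexec --onnx=resnet18-v1-7.onnx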