trtexec cannot convert a ResNet152 ONNX model to a TRT engine, and exits without reporting any error!

Description

I converted the resnet152 model to ONNX format and tried to convert it to a TRT engine file with trtexec, but trtexec exited without producing an engine or reporting any error.

Environment

TensorRT Version: 7.2.2.3
GPU Type: RTX 2060 Super / RTX 3070
Nvidia Driver Version: 457.51
CUDA Version: 10.2
CUDNN Version: 8.1.1.33
Operating System + Version: Windows 10
Python Version (if applicable): 3.6.12
PyTorch Version (if applicable): 1.7
Baremetal or Container (if container which image + tag):

Relevant Files

to_onnx.py
ResNet

Steps To Reproduce

  1. Generate the ONNX file with to_onnx.py (see the export sketch after this list):
python to_onnx.py 
  2. Convert the ONNX file to a TensorRT engine file:
trtexec --onnx=resnet_output_224.onnx --minShapes=input0:16x3x224x224  --optShapes=input0:16x3x224x224 --maxShapes=input0:16x3x224x224 --fp16 --workspace=5000 --saveEngine=resnet.bin --verbose
  3. trtexec exits without any warning / error prompted. The verbose log is attached: trtexec_log
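For reference, an export script along these lines produces a comparable ONNX file. This is only an illustrative sketch, not the exact to_onnx.py linked above; the pretrained torchvision weights, the output name, and the opset version are assumptions (the input name input0 and fixed 16x3x224x224 shape match the trtexec command):

# Illustrative sketch of an export script similar to to_onnx.py.
# Assumptions: torchvision pretrained resnet152, output name "output0", opset 11.
import torch
import torchvision

model = torchvision.models.resnet152(pretrained=True).eval()
dummy = torch.randn(16, 3, 224, 224)  # fixed shape, matching the shapes passed to trtexec

torch.onnx.export(
    model,
    dummy,
    "resnet_output_224.onnx",
    input_names=["input0"],    # matches input0 in the trtexec command
    output_names=["output0"],
    opset_version=11,
)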

Please give me a hand on it. Thank you.

Lanny

Hi,
Request you to share the ONNX model and the script, if not shared already, so that we can assist you better.
Meanwhile, you can try a few things:

  1. Validate your model with the below snippet:

check_model.py

import onnx

filename = "yourONNXmodel"  # path to your ONNX file
model = onnx.load(filename)
onnx.checker.check_model(model)
  2. Try running your model with the trtexec command:
https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/trtexec
In case you are still facing the issue, request you to share the trtexec --verbose log for further debugging.
Thanks!

Thank you for your quick response.
I tried the script you provided, and there is NO OUTPUT from check_model. Is that normal?

I did provide the trtexec log with --verbose enabled; please find it at the link below.
trtexec log

I also provide the ResNet and ONNX conversion scripts below. You can simply click to check them.
ResNet.py
to_onnx.py

Thank you in advance.

Lanny

Hi @lannyyip1,

No output from check_model is normal if the model is valid. Would you mind sharing the ONNX model with us so we can try it on our end for better debugging?

Thank you.

The ONNX file is converted from the official PyTorch resnet152.
Please find it in the link below.
onnx file

Lanny

@spolisetty any update?

@lannyyip1,

Please allow us some time to work on this issue. Thank you.

Hello @lannyyip1,

We recommend you try the latest TensorRT 8.0 GA version. Please let us know if you still face this issue.
https://developer.nvidia.com/nvidia-tensorrt-8x-download

Thank you.

Hi, @spolisetty
I tried TensorRT 8.0 as you recommended. trtexec stops much earlier than with 7.2.2.3.
Here is the log:

[07/06/2021-10:28:01] [I] TensorRT version: 7202
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::GridAnchor_TRT version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::NMS_TRT version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::Reorg_TRT version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::Region_TRT version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::Clip_TRT version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::LReLU_TRT version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::PriorBox_TRT version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::Normalize_TRT version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::RPROI_TRT version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::BatchedNMS_TRT version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::BatchedNMSDynamic_TRT version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::FlattenConcat_TRT version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::CropAndResize version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::DetectionLayer_TRT version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::Proposal version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::ProposalLayer_TRT version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::PyramidROIAlign_TRT version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::ResizeNearest_TRT version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::Split version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::SpecialSlice_TRT version 1
[07/06/2021-10:28:01] [V] [TRT] Registered plugin creator - ::InstanceNormalization_TRT version 1

F:\win_lib\TensorRT-8.0.1.6\bin>

Please help me on this. Thank you.

Lanny

Thank you for the confirmation @lannyyip,
Please allow us some time to work on this.

Thank you. Sorry to bother you, but is there any update? I tried to convert ResNet18 and got the same result: trtexec terminated without any warning or error prompted.

Lanny.

Hi @lannyyip1,

Thank you for your patience. It looks like your environment is not set up correctly. We recommend you install the latest TensorRT version 8.0.1 correctly by following the installation guide.

Besides that, your model is not a dynamic ONNX model, so it is not allowed to set shapes.
Please remove --minShapes=input0:16x3x224x224 --optShapes=input0:16x3x224x224 --maxShapes=input0:16x3x224x224

Just run trtexec --onnx=resnet_output_224.onnx --fp16 --workspace=5000 --saveEngine=resnet.bin --verbose
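(If dynamic shapes are actually desired, the model would first need to be re-exported with dynamic axes before those shape flags become meaningful. A minimal sketch, assuming a standard torch.onnx.export call; the model, file name, and opset shown here are illustrative:)

# Illustrative only: re-exporting with dynamic_axes is what makes
# --minShapes/--optShapes/--maxShapes on input0 meaningful.
import torch
import torchvision

model = torchvision.models.resnet152(pretrained=True).eval()
dummy = torch.randn(1, 3, 224, 224)

torch.onnx.export(
    model,
    dummy,
    "resnet_output_224_dynamic.onnx",
    input_names=["input0"],
    output_names=["output0"],
    opset_version=11,
    dynamic_axes={"input0": {0: "batch"}, "output0": {0: "batch"}},  # batch dim made dynamic
)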

Thank you.

Hi, @spolisetty
Thank you for the information. With TensorRT 8, I found that it complains about the squeeze operation in the model, which conflicts with (is not compatible with) the dynamic dimensions requested in the ONNX export. After removing the squeeze operation from the model, the conversion to TRT works, even with dynamic dimensions.
It turns out this is not limited to TensorRT 8; TensorRT 7.2.2 can also convert the model with the fix mentioned above.
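For anyone else hitting this, one common way to apply such a fix looks like the sketch below. This is an illustrative classifier head, not the actual ResNet.py (which is only linked above): squeeze() after the global average pool also collapses a batch dimension of size 1, which is what conflicts with dynamic shapes, whereas flatten keeps the batch dimension intact.

# Illustrative fix (ResNet.py is only linked above, so this head is an assumption):
# replace squeeze() after the global average pool with flatten(x, 1), so the
# batch dimension is never collapsed when it is 1 or dynamic.
import torch
import torch.nn as nn

class ClassifierHead(nn.Module):
    def __init__(self, channels=2048, num_classes=1000):
        super().__init__()
        self.avgpool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(channels, num_classes)

    def forward(self, x):
        x = self.avgpool(x)        # N x C x 1 x 1
        # was: x = x.squeeze()     # problematic: also squeezes a dynamic batch dim
        x = torch.flatten(x, 1)    # N x C, batch dimension preserved
        return self.fc(x)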
Thank you very much for your clue in fixing this problem!

Lanny
