Same TensorRT version, two methods to convert an ONNX model: trtexec [FAILED], Python API [SUCCESS]

Description

Same TensorRT version (8.6.1), two methods to convert the same ONNX model: one used trtexec [FAILED], the other used the Python API [SUCCESS].

Hi, I am trying to convert an ONNX model with dynamic inputs to TensorRT format. I used two methods to convert this model, but I get an error when using trtexec.

Environment

TensorRT Version: 8.6.1
GPU Type: NVIDIA GeForce RTX 2060
Nvidia Driver Version: 530.41.03
CUDA Version: 11.4
CUDNN Version: 8.6
Operating System + Version: ubuntu 18.04
Python Version (if applicable): 3.8
PyTorch Version (if applicable): 1.9

Using the Python API to convert our model:
Here is the Python code:

We generated the .trt file successfully, but with a warning:

onnx2trt.py:25: DeprecationWarning: Use build_serialized_network instead.
engine = builder.build_engine(network, config)
[09/07/2023-19:41:33] [TRT] [W] [RemoveDeadLayers] Input Tensor center_objects_type is unused or used only at compile-time, but is not being removed.
success generate trtfile
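The `DeprecationWarning` above comes from calling `builder.build_engine(network, config)`, which TensorRT 8.x deprecates in favor of `build_serialized_network`. A minimal sketch of a conversion script using the non-deprecated call is below; the file paths and workspace size are illustrative, and a real script for a model with dynamic inputs would also need an optimization profile with min/opt/max shapes for each dynamic input (this requires a machine with TensorRT and a GPU, so it is a sketch, not a drop-in replacement for the original onnx2trt.py):

```python
import tensorrt as trt

ONNX_PATH = "simplified_model.onnx"   # illustrative path
ENGINE_PATH = "test.trt"              # illustrative path

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)

# Explicit-batch network is required when parsing ONNX models.
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open(ONNX_PATH, "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("failed to parse ONNX model")

config = builder.create_builder_config()
# 1 GiB workspace; adjust to taste.
config.set_memory_pool_limit(trt.MemoryPoolType.WORKSPACE, 1 << 30)

# For dynamic inputs, an optimization profile must provide shape ranges,
# e.g. profile.set_shape(name, min_shape, opt_shape, max_shape).
profile = builder.create_optimization_profile()
config.add_optimization_profile(profile)

# Non-deprecated path: serialize directly instead of build_engine().
serialized_engine = builder.build_serialized_network(network, config)
if serialized_engine is None:
    raise RuntimeError("engine build failed")

with open(ENGINE_PATH, "wb") as f:
    f.write(serialized_engine)
print("success generate trtfile")
```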

Using trtexec to convert our model:
ERROR log:

[09/07/2023-20:04:41] [V] [TRT] --------------- Timing Runner: {ForeignNode[Unsqueeze_11…Slice_2392]} (Myelin)
[09/07/2023-20:05:17] [W] [TRT] Skipping tactic 0 due to Myelin error: No results returned from cublas heuristic search
[09/07/2023-20:05:17] [V] [TRT] Fastest Tactic: -3360065831133338131 Time: inf
[09/07/2023-20:05:17] [V] [TRT] Deleting timing cache: 145 entries, 633 hits
[09/07/2023-20:05:17] [E] Error[10]: [optimizer.cpp::computeCosts::2011] Error Code 10: Internal Error (Could not find any implementation for node {ForeignNode[Unsqueeze_11…Slice_2392]}.)
[09/07/2023-20:05:17] [E] Error[2]: [builder.cpp::buildSerializedNetwork::609] Error Code 2: Internal Error (Assertion enginePtr != nullptr failed. )
[09/07/2023-20:05:17] [E] Engine could not be created from network
[09/07/2023-20:05:17] [E] Building engine failed
[09/07/2023-20:05:17] [E] Failed to create engine from model.
[09/07/2023-20:05:17] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec [TensorRT v8205] # ./trtexec --onnx=/home/root/haomo/dpc/onnx/simplified_model.onnx --saveEngine=/home/root/haomo/dpc/onnx/test.trt --verbose --workspace=100000

How can I fix this trtexec error? I would really appreciate any help.

Hi,
Could you please share the ONNX model and the script, if not shared already, so that we can assist you better?
In the meantime, you can try a few things:

  1. Validate your model with the snippet below:

check_model.py

import sys
import onnx

filename = sys.argv[1]  # path to your ONNX model
model = onnx.load(filename)
onnx.checker.check_model(model)
  2. Try running your model with the trtexec command.

In case you are still facing the issue, please share the trtexec --verbose log for further debugging.
Thanks!

Hi, I have checked the ONNX model and tried trtexec to convert it, but it FAILED.
Here is the error log:

[09/07/2023-20:04:41] [V] [TRT] --------------- Timing Runner: {ForeignNode[Unsqueeze_11…Slice_2392]} (Myelin)
[09/07/2023-20:05:17] [W] [TRT] Skipping tactic 0 due to Myelin error: No results returned from cublas heuristic search
[09/07/2023-20:05:17] [V] [TRT] Fastest Tactic: -3360065831133338131 Time: inf
[09/07/2023-20:05:17] [V] [TRT] Deleting timing cache: 145 entries, 633 hits
[09/07/2023-20:05:17] [E] Error[10]: [optimizer.cpp::computeCosts::2011] Error Code 10: Internal Error (Could not find any implementation for node {ForeignNode[Unsqueeze_11…Slice_2392]}.)
[09/07/2023-20:05:17] [E] Error[2]: [builder.cpp::buildSerializedNetwork::609] Error Code 2: Internal Error (Assertion enginePtr != nullptr failed. )
[09/07/2023-20:05:17] [E] Engine could not be created from network
[09/07/2023-20:05:17] [E] Building engine failed
[09/07/2023-20:05:17] [E] Failed to create engine from model.
[09/07/2023-20:05:17] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec [TensorRT v8205] # ./trtexec --onnx=/home/root/haomo/dpc/onnx/simplified_model.onnx --saveEngine=/home/root/haomo/dpc/onnx/test.trt --verbose --workspace=100000

Here is the ONNX model; looking forward to your reply.
simplified_model.onnx (15.1 MB)

Hi,

We recommend using the latest TensorRT version, 8.6.1.
We were able to build the TensorRT engine successfully:

[09/11/2023-11:49:18] [I]
&&&& PASSED TensorRT.trtexec [TensorRT v8601] # trtexec --onnx=model.onnx --verbose

Thank you.

That’s surprising, because the version I used is 8.6.1.