TensorRT Parsing ONNX Model Error

Description

In my C++ code, I am calling the TensorRT 8.0 API to load and parse an ONNX model, and it runs into an error. The gist of the error is:

“Invalid Node - model/flatten_3/Reshape
Attribute not found: allowzero”
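
My actual code is C++, but the failing step is equivalent to this minimal Python sketch of the TensorRT ONNX parser (the model path is a placeholder, not my exact code):

    import tensorrt as trt

    TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
    builder = trt.Builder(TRT_LOGGER)
    # ONNX models require an explicit-batch network definition
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)

    # parse() returns False and records errors when a node is rejected
    with open("mymodel.onnx", "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))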

Exactly the same error message as in this post: Error when transform large onnx model to trt - githubmemory

Does anyone have any idea how this was resolved in the above post? Thanks!

Environment

TensorRT Version: 8.0
GPU Type: GeForce
Nvidia Driver Version: 465.89
CUDA Version: 11.3
CUDNN Version: 8.2.1
Operating System: Windows 10

Relevant Files

Steps To Reproduce

Hi,
Please share the ONNX model and the script, if not shared already, so that we can assist you better.
Alongside, you can try a few things:

  1. Validate your model with the snippet below:

check_model.py

import sys
import onnx

filename = sys.argv[1]  # path to your ONNX model
model = onnx.load(filename)
onnx.checker.check_model(model)
2. Try running your model with the trtexec command (example invocation below).
https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/trtexec
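A typical invocation (model path is a placeholder) would look like:

    trtexec --onnx=your_model.onnx --verbose
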
In case you are still facing the issue, please share the trtexec --verbose log for further debugging.
Thanks!

Hi NVES,

Thanks very much.

  1. I ran the check_model.py snippet on my ONNX model file. It ran and finished quietly, so I assume this means it validated successfully. The full code in my check_model.py is below.
    import sys
    import onnx
    filename = r"C:\mypath\mymodel.onnx"
    model = onnx.load(filename)
    onnx.checker.check_model(model, full_check=True)

  2. As for the full log, could I work with Nvidia directly instead of sharing such information on the public forums?

Ok, problem resolved.

Hello, I have the same problem.

Please explain how you resolved it.


@Gudbach
Can you explain?

I got the same error. For me it was because my ONNX model was using opset 14, which is not currently supported by TensorRT (the allowzero attribute in the error message was only introduced for Reshape in opset 14). I lowered it to opset 13 and it worked.
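
If re-exporting the model from the original framework with a lower opset (e.g., the opset_version argument of torch.onnx.export for PyTorch models) is not convenient, here is a rough sketch of downgrading with onnx's version converter; the file names are placeholders, and it assumes every op in the model has a valid opset-13 mapping:

    import onnx
    from onnx import version_converter

    model = onnx.load("model_opset14.onnx")
    # Convert the whole graph down to opset 13 so the TensorRT 8.0 parser accepts it
    converted = version_converter.convert_version(model, 13)
    onnx.save(converted, "model_opset13.onnx")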


Hello, I have the same error. @NVES @Gudbach, could you please share the solution?

Literally the worst type of person. Why would you say you solved it but not provide an answer? Ugh.
