Error[4]: [graphShapeAnalyzer.cpp::analyzeShapes::1285] Error Code 4: Miscellaneous

I am converting an ONNX model to a TensorRT engine, but I got this error.

[08/10/2023-06:31:48] [W] [TRT] onnx2trt_utils.cpp:366: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[08/10/2023-06:31:51] [W] [TRT] onnx2trt_utils.cpp:392: One or more weights outside the range of INT32 was clamped
[08/10/2023-06:31:51] [E] Error[4]: [graphShapeAnalyzer.cpp::analyzeShapes::1285] Error Code 4: Miscellaneous (IElementWiseLayer Mul_1384: broadcast dimensions must be conformable)
[08/10/2023-06:31:51] [E] [TRT] ModelImporter.cpp:773: While parsing node number 1384 [Mul -> "2291"]:
[08/10/2023-06:31:51] [E] [TRT] ModelImporter.cpp:774: --- Begin node ---
[08/10/2023-06:31:51] [E] [TRT] ModelImporter.cpp:775: input: "2288"
input: "3095"
output: "2291"
name: "Mul_1384"
op_type: "Mul"

[08/10/2023-06:31:51] [E] [TRT] ModelImporter.cpp:776: --- End node ---
[08/10/2023-06:31:51] [E] [TRT] ModelImporter.cpp:779: ERROR: ModelImporter.cpp:179 In function parseGraph:
[6] Invalid Node - Mul_1384
[graphShapeAnalyzer.cpp::analyzeShapes::1285] Error Code 4: Miscellaneous (IElementWiseLayer Mul_1384: broadcast dimensions must be conformable)
[08/10/2023-06:31:51] [E] Failed to parse onnx file
[08/10/2023-06:31:51] [I] Finish parsing network model
[08/10/2023-06:31:51] [E] Parsing model failed
[08/10/2023-06:31:51] [E] Failed to create engine from model.
[08/10/2023-06:31:51] [E] Engine set up fail

How can I resolve this problem? I tried exporting with both implicit and explicit batch, and the error was the same.
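For context, "broadcast dimensions must be conformable" means the two inputs to Mul_1384 must satisfy NumPy/ONNX-style multidirectional broadcasting: aligning the shapes from the right, each dimension pair must be equal or contain a 1. A minimal stand-alone sketch of that rule (the example shapes are made up for illustration, not taken from the model):

```python
def broadcastable(shape_a, shape_b):
    """NumPy/ONNX multidirectional broadcasting rule: align shapes from
    the right; each dimension pair must be equal or contain a 1."""
    for da, db in zip(reversed(shape_a), reversed(shape_b)):
        if da != db and da != 1 and db != 1:
            return False
    return True

print(broadcastable((1, 3, 4), (4,)))  # True  -> Mul is legal
print(broadcastable((1, 3, 4), (5,)))  # False -> "dimensions must be conformable"
```

If the two tensors feeding Mul_1384 fail this rule after export, the mismatch usually comes from the exporter tracing a shape it could not resolve, so inspecting the shapes of inputs "2288" and "3095" in the exported graph is a good starting point.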

Hi,
Could you share the ONNX model and the script, if not shared already, so that we can assist you better?
In the meantime, you can try a few things:

  1. Validate your model with the snippet below:

check_model.py

import onnx

filename = "your_model.onnx"  # path to your ONNX model
model = onnx.load(filename)
onnx.checker.check_model(model)

  2. Try running your model with the trtexec command.

In case you are still facing the issue, please share the trtexec --verbose log for further debugging.
Thanks!

@AakankshaS
Thanks for response.
I checked the .onnx model and it is normal.

Hi,

We recommend that you please try the latest TensorRT version 8.6.1.
If you face the same issue, please share with us the complete verbose logs and, if possible, the ONNX model for better assistance.

Thank you.

@johnminho Hello! How did you solve the problem? I have the same issue.