success = parser.parse_from_file("MyModel.onnx")
for idx in range(parser.num_errors):
    print(parser.get_error(idx))
if not success:
    print("Parsing the ONNX file failed")
and I get the following error:
ERROR: builtin_op_importers.cpp:2651 In function importResize:
[8] Assertion failed: scales.is_weights() && “Resize scales must be an initializer!”
My question is: how can I debug this issue? I have a really big model.
What is the best debugging process when I see an error like this?
Hi,
Could you share the ONNX model and the script, if you haven't already, so that we can assist you better?
In the meantime, you can try a few things:
1) Validate your model with the snippet below (check_model.py):
import sys
import onnx

filename = sys.argv[1]  # path to your ONNX model
model = onnx.load(filename)
onnx.checker.check_model(model)
2) Try running your model with the trtexec command, e.g. trtexec --onnx=MyModel.onnx --verbose. See https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/trtexec
If you are still facing the issue, please share the trtexec --verbose log for further debugging.
Thanks!
Trtexec gives me the same error
I cannot check the ONNX file with the ONNX checker, because I wrote a plugin for an unsupported layer and disabled the checker when I exported the ONNX. I think that layer is fine and the error comes from somewhere else.
I can open the ONNX file with Netron, but the model is very big. I just want to understand the debugging process and how I can identify the problems. From what I can tell, I have a lot of problems with the model, and I want to identify each and every one of them.
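For this particular assertion, the parser is telling you that the scales input of some Resize node is produced at runtime instead of being a constant initializer. Rather than hunting through Netron, you can scan the graph programmatically for every offending node. Below is a minimal sketch of that check using plain dicts as a stand-in for the onnx protobuf objects (with the real onnx package you would walk model.graph.node and collect names from model.graph.initializer); the node and tensor names are made up for illustration:

```python
# Sketch: find Resize nodes whose 'scales' input (input index 2 in
# opset >= 11) is not a graph initializer, i.e. the condition the
# TensorRT ONNX importer asserts on. Plain dicts stand in for the
# onnx protobuf graph here.

def find_bad_resize_nodes(nodes, initializer_names):
    """Return names of Resize nodes whose scales input is not constant."""
    bad = []
    for node in nodes:
        if node["op_type"] != "Resize":
            continue
        inputs = node["inputs"]
        # Opset 11+ Resize inputs are (X, roi, scales, sizes).
        scales = inputs[2] if len(inputs) > 2 else ""
        if scales and scales not in initializer_names:
            bad.append(node["name"])
    return bad

# Toy graph: one Resize fed by a constant, one fed by a runtime tensor.
nodes = [
    {"name": "resize_ok", "op_type": "Resize",
     "inputs": ["x", "", "const_scales", ""]},
    {"name": "resize_bad", "op_type": "Resize",
     "inputs": ["y", "", "computed_scales", ""]},
]
initializers = {"const_scales"}

print(find_bad_resize_nodes(nodes, initializers))  # ['resize_bad']
```

If the scales are actually computed from constants in the graph, constant folding can often turn them into initializers before you hand the model to TensorRT; tools such as polygraphy (polygraphy surgeon sanitize --fold-constants) or onnx-simplifier are commonly used for this, though whether it resolves your case depends on how the scales are produced.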