Hi,
Please share the ONNX model and the script if you haven't already, so that we can assist you better.
In the meantime, you can try a few things:
1) Validate your model with the snippet below:
check_model.py
import onnx

# Replace with the path to your ONNX model
filename = "your_model.onnx"
model = onnx.load(filename)
onnx.checker.check_model(model)
2) Try running your model with the trtexec command: https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/trtexec
If you are still facing the issue, please share the trtexec --verbose log for further debugging.
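For example, assuming your model file is named your_model.onnx (a placeholder, replace it with your actual path), the verbose output can be captured to a file like this:
trtexec --onnx=your_model.onnx --verbose > trtexec_verbose.log 2>&1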
Thanks!
Thank you. Sorry to bother you. Is there any update? I tried to convert ResNet18 and got the same result: trtexec terminated without printing any warning or error.
Thank you for your patience. It looks like your environment is not set up correctly. We recommend that you install the latest TensorRT version, 8.0.1, by following the installation guide.
Besides that, your model is not a dynamic ONNX model, so it is not allowed to set up shapes.
Please remove --minShapes=input0:16x3x224x224 --optShapes=input0:16x3x224x224 --maxShapes=input0:16x3x224x224
Just run trtexec --onnx=resnet_output_224.onnx --fp16 --workspace=5000 --saveEngine=resnet.bin --verbose
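Once that command succeeds, you can sanity-check the engine written by --saveEngine with the TensorRT Python API. This is only a minimal sketch, assuming the engine file is resnet.bin as in the command above:
load_engine.py
import tensorrt as trt

# Deserialize the engine that trtexec wrote with --saveEngine=resnet.bin
TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
with open("resnet.bin", "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())

# A non-None engine means the file deserialized successfully
print("Engine loaded:", engine is not None)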
Hi, @spolisetty
Thank you for your information. With TensorRT 8, I found that it complains about the Squeeze operation in the model, which conflicts with (or is not compatible with) the dynamic dimensions requested in the ONNX model. So I removed the Squeeze operation from the model and found that the model then converts to TRT successfully, even with dynamic dimensions.
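For anyone hitting the same issue, here is a rough sketch of how a Squeeze node can be dropped from the graph with the onnx Python package. The file names are placeholders, it assumes the model contains exactly one Squeeze node, and you may also need to update the declared shape of any graph output that the Squeeze used to feed:
remove_squeeze.py
import onnx

model = onnx.load("resnet_output_224.onnx")  # placeholder input path
graph = model.graph

# Find the Squeeze node (assumes exactly one)
squeeze = next(n for n in graph.node if n.op_type == "Squeeze")
squeeze_in, squeeze_out = squeeze.input[0], squeeze.output[0]

# Rewire every consumer of the Squeeze output to read its input instead
for node in graph.node:
    for i, name in enumerate(node.input):
        if name == squeeze_out:
            node.input[i] = squeeze_in

# If the Squeeze fed a graph output, point that output at the Squeeze input
for out in graph.output:
    if out.name == squeeze_out:
        out.name = squeeze_in

graph.node.remove(squeeze)
onnx.checker.check_model(model)
onnx.save(model, "resnet_output_224_nosqueeze.onnx")  # placeholder output path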
It turns out this is not limited to TensorRT 8: TensorRT 7.2.2 can also convert the model with the fix mentioned above.
Thank you very much for the clue that helped fix this problem!