Hi,
Please share the ONNX model and the script, if not shared already, so that we can assist you better.
Alongside, you can try a few things:
1) Validate your model with the below snippet:
check_model.py
import onnx

filename = "yourONNXmodel.onnx"  # path to your ONNX model
model = onnx.load(filename)
onnx.checker.check_model(model)
2) Try running your model with the trtexec command.
If you are still facing the issue, please share the trtexec --verbose log for further debugging.
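A typical invocation looks like the following; this is a sketch assuming trtexec is on your PATH and the model filename matches yours:

```shell
# Build a TensorRT engine from the ONNX model and emit verbose,
# layer-by-layer logs that can be attached for debugging.
trtexec --onnx=yolov5s.onnx --verbose
```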
Thanks!
I am able to calculate the number of parameters for my ONNX model. How about for the TensorRT model? I checked all the flags for trtexec, but it seems it's not supported. I don't think the engine builder API helps either, unless it has been updated regarding this issue.
I think it's kind of basic knowledge whether TensorRT optimizations change the number of parameters or not, and I am missing that basic knowledge. That's why I'm here :)
My ONNX model as an attachment: yolov5s.onnx. It has 7.2M parameters. I downloaded it from the yolov5 GitHub repo.
It doesn't have exact info about the number of parameters. A lot of info about the layers, though, but it's a lot of work to calculate the number of params from the layers manually. Is there a tutorial somewhere on how to interpret the output? For example, I'm not exactly sure how many parameters Conv_304[Half(1,512,20,20)] has, even though I know the size of its output and the precision of its elements.
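For what it's worth, a single Conv layer's parameter count comes from its weight shape, not its output shape: out_channels × in_channels × kH × kW, plus out_channels if there is a bias. The output Half(1,512,20,20) only tells you out_channels = 512; you would still need the input channel count and kernel size. A small sketch, where the example input channels and kernel size are assumptions and not read from yolov5s.onnx:

```python
def conv_param_count(out_ch: int, in_ch: int, kh: int, kw: int,
                     bias: bool = True) -> int:
    """Number of learnable parameters in a 2D convolution layer."""
    params = out_ch * in_ch * kh * kw  # weight tensor (out, in, kH, kW)
    if bias:
        params += out_ch               # one bias value per output channel
    return params

# Hypothetical shapes for a layer like Conv_304 with 512 output channels:
# a 3x3 convolution from 256 to 512 channels, with bias.
print(conv_param_count(512, 256, 3, 3))  # 512*256*3*3 + 512 = 1180160
```

The precision (Half vs Float) changes the storage size per element, not the parameter count.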