Issue while converting from ONNX to TensorRT Engine

Hi,


I am converting my custom PyTorch model to a TensorRT engine file. I am able to export the model to ONNX, but I am facing an issue while converting from ONNX to the engine file. I am referring to this blog for the conversion (https://developer.nvidia.com/blog/speeding-up-deep-learning-inference-using-tensorflow-onnx-and-tensorrt/). I am getting the error:
AttributeError: 'NoneType' object has no attribute 'serialize'
I am passing the correct input shape to the build-engine function.
A ResNet-50 model converts successfully, but my custom model fails with the error above.
Please suggest if there is any specific reason behind this.

Thanks

Hi @darshancganji12
Request you to share your ONNX model along with the verbose logs.
Alternatively, you can try running your onnx model with trtexec command.
https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/trtexec
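In case it helps, here is a rough sketch of how that 'NoneType' error usually arises: the builder returns None after a silent parser or build failure, and calling .serialize() on that None raises the AttributeError. This assumes the TensorRT 7-era Python API used in the blog; the file name is just a placeholder.

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.VERBOSE)

def build_engine(onnx_path):
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            # Surface the real parser errors instead of failing later with
            # "'NoneType' object has no attribute 'serialize'"
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            return None
    # build_cuda_engine returns None (not an exception) on failure
    return builder.build_cuda_engine(network)

engine = build_engine("last.onnx")
if engine is None:
    raise RuntimeError("Engine build failed; see the parser/builder logs above")
serialized = engine.serialize()
```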

Thanks!

Hi,
Thanks for your reply.
I tried the trtexec command and I am getting the error attached below:

While parsing node number 195 [Resize]:
ERROR: ModelImporter.cpp:124 In function parseGraph:
[5] Assertion failed: ctx->tensors().count(inputName)
[09/25/2020-11:02:52] [E] Failed to parse onnx file
[09/25/2020-11:02:52] [E] Parsing model failed
[09/25/2020-11:02:52] [E] Engine creation failed
[09/25/2020-11:02:52] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec # trtexec --onnx=last.onnx --batch=1 --saveEngine=last.engine

The command I used: trtexec --onnx=last.onnx --batch=1 --saveEngine=last.engine

Looking forward to your reply.

Thanks

Hi @darshancganji12,
Request you to share your ONNX model.

Thanks!

Hi @AakankshaS,

Thanks for your reply. Here is the ONNX model: https://1drv.ms/u/s!Apz7xuPhdGTbhMdWtcpq4GnJJ4aX_Q?e=0C4z7o.

I am facing the error shown below:
[TensorRT] ERROR: Network must have at least one output
[TensorRT] ERROR: Network validation failed.
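From what I understand, this error means no tensor in the parsed network was marked as an output, usually because parsing stopped early. A possible workaround is sketched below (assuming the TensorRT 7 Python API; whether it is safe depends on why parsing stopped): print the parser errors first, and only mark the last layer's output if parsing actually succeeded.

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(TRT_LOGGER)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, TRT_LOGGER)

with open("last.onnx", "rb") as f:
    ok = parser.parse(f.read())

for i in range(parser.num_errors):
    print(parser.get_error(i))  # the real cause is usually reported here

# "Network must have at least one output" => no tensor was marked as output
if ok and network.num_outputs == 0:
    last = network.get_layer(network.num_layers - 1)
    network.mark_output(last.get_output(0))
```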

Thanks

Hi @darshancganji12,
I could not reproduce the issue with the latest TensorRT release; can you please try that?

Thanks!