Description
The ONNX parser cannot detect the output layers of the ONNX model.
While parsing node number 343 [Softmax -> "766"]:
--- Begin node ---
input: "763"
output: "766"
name: "Softmax_343"
op_type: "Softmax"
attribute {
name: "axis"
i: 2
type: INT
}
--- End node ---
ERROR: /home/adlink/TensorRT/parsers/onnx/ModelImporter.cpp:536 In function importModel:
[7] Assertion failed: _importer_ctx.tensors().at(output.name()).is_tensor()
[09/14/2020-09:31:38] [E] Failed to parse onnx file
[09/14/2020-09:31:38] [E] Parsing model failed
[09/14/2020-09:31:38] [E] Engine creation failed
[09/14/2020-09:31:38] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec # /usr/src/tensorrt/bin/trtexec --onnx=yolact_op10.onnx --verbose
Environment
TensorRT Version: 7.1.3
GPU Type: GTX1660
Nvidia Driver Version: 450.57
CUDA Version: 11.0.221
CUDNN Version: 8.0.2.39
Operating System + Version: Ubuntu 18.04.4
Python Version (if applicable): 3.6.9
TensorFlow Version (if applicable):
PyTorch Version (if applicable): 1.6.0
Baremetal or Container (if container which image + tag): TensorRT container nvcr.io/nvidia/tensorrt:20.08-py3
Relevant Files
Model file
https://drive.google.com/file/d/1i4v5xwCdKSwt-dIhpYRjTcACy5tRHK5S/view?usp=sharing
Steps To Reproduce
/usr/src/tensorrt/bin/trtexec --onnx=yolact_op10.onnx --verbose
GitHub repository of the corresponding model; I changed the export script to export to ONNX opset 10.