ONNX parser cannot assert that the output is a tensor (Yolact model)


The ONNX parser cannot detect the output layers from the ONNX model.

While parsing node number 343 [Softmax -> "766"]:
--- Begin node ---
input: "763"
output: "766"
name: "Softmax_343"
op_type: "Softmax"
attribute {
  name: "axis"
  i: 2
  type: INT
}
--- End node ---
ERROR: /home/adlink/TensorRT/parsers/onnx/ModelImporter.cpp:536 In function importModel:
[7] Assertion failed: _importer_ctx.tensors().at(output.name()).is_tensor()
[09/14/2020-09:31:38] [E] Failed to parse onnx file
[09/14/2020-09:31:38] [E] Parsing model failed
[09/14/2020-09:31:38] [E] Engine creation failed
[09/14/2020-09:31:38] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec # /usr/src/tensorrt/bin/trtexec --onnx=yolact_op10.onnx --verbose


TensorRT Version: 7.1.3
GPU Type: GTX1660
Nvidia Driver Version: 450.57
CUDA Version: 11.0.221
CUDNN Version:
Operating System + Version: Ubuntu 18.04.4
Python Version (if applicable): 3.6.9
TensorFlow Version (if applicable):
PyTorch Version (if applicable): 1.6.0
Baremetal or Container (if container which image + tag): TensorRT container nvcr.io/nvidia/tensorrt:20.08-py3

Relevant Files

Model file

Steps To Reproduce

/usr/src/tensorrt/bin/trtexec --onnx=yolact_op10.onnx --verbose

GitHub repository of the corresponding model; I changed the export script to export to ONNX opset 10.

Hi @hiro.nguyen,
I tried running your model and could reproduce the issue.
We are checking on this. Please stay tuned.

Hi @AakankshaS, I have updated some more info.

Hi, any updates so far? Thank you.

We are facing the same problem. Any updates, @AakankshaS?