ONNX parse problem with TopK

Description

Hi,
I’m trying to convert an SSD ONNX model to a TensorRT engine with the onnx2trt executable.
Because the model contains NonMaxSuppression, I wrote a plugin that inherits from IPluginV2DynamicExt to support dynamic shapes.
After NonMaxSuppression, parsing aborts at a TopK layer with the message below:

While parsing node number 498 [TopK -> “TopK_717”]:
ERROR: /home/u5393118/TensorRT/parsers/onnx/builtin_op_importers.cpp:3283 In function importTopK:
[8] Assertion failed: inputs.at(1).is_weights()

I’m not sure whether modifying the TopK importer will solve this issue, or whether I shouldn’t touch it since it is a built-in op.
Is there another recommended solution?

Thank you.

Environment

TensorRT Version : 7.0.0-1
GPU Type : Tesla V100
Nvidia Driver Version : 450.51.05
CUDA Version : 11.0
CUDNN Version :
Operating System + Version : Ubuntu 18.04
Python Version (if applicable) : 3.6.9
TensorFlow Version (if applicable) :
PyTorch Version (if applicable) :
Baremetal or Container (if container which image + tag) :

Relevant Files

ONNX model is downloaded from https://github.com/onnx/models/tree/master/vision/object_detection_segmentation/ssd

NonMaxSuppressionPlugin.cpp (7.4 KB) CMakeLists.txt (732 Bytes) NonMaxSuppressionPlugin.h (3.7 KB) builtin_op_importers.cpp (165.4 KB)

Steps To Reproduce

  1. In the plugin folder, mkdir NonMaxSuppressionPlugin
  2. Add CMakeLists.txt, NonMaxSuppressionPlugin.cpp, and NonMaxSuppressionPlugin.h
  3. Modify parsers/onnx/builtin_op_importers.cpp
  4. Add initializePlugin<nvinfer1::plugin::NonMaxSuppressionPluginCreator>(logger, libNamespace) in InferPlugin.cpp
  5. Add NonMaxSuppression to the CMakeLists.txt in TensorRT/plugin
  6. Run make and make install in TensorRT/build/
  7. Rebuild libnvinfer.so and libnvonnxparser.so
  8. Copy the .so files from step 7 to /usr/lib/x86_64-linux-gnu
  9. Change to bin/ and run onnx2trt ssd-10.onnx -o ssd.trt

Hi, please share the ONNX model and the script so that we can assist you better.

Meanwhile, you can try validating your model with the snippet below:

check_model.py

import onnx

model = onnx.load("your_model.onnx")  # path to your ONNX model
onnx.checker.check_model(model)

Alternatively, you can try running your model with trtexec command.
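For reference, a trtexec invocation for this model might look like the following (the engine filename is just an example; flags as in TensorRT 7’s trtexec):

```shell
# Parse the ONNX model, build an engine, and save it; --verbose prints parser/builder logs
trtexec --onnx=ssd-10.onnx --saveEngine=ssd.trt --verbose
```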

Thanks!

Thanks for reply,
I have tried the onnx checker and no error was reported.

Uploading the model file from my PC gives an error.
Here’s the download link:

Thank you,
Kevin.

Hi @disculus2012,

Please share the verbose log so we can debug.

Thank you.