Convert Mask-RCNN in PyTorch to TensorRT (SplitToSequence)

Description

Hi, I tried to convert Mask-RCNN from PyTorch to a TensorRT engine but I got an error.

Environment

TensorRT Version:
GPU Type: Quadro RTX 6000
Nvidia Driver Version: 460.27.04
CUDA Version: 10.1
CUDNN Version:
Operating System + Version: Ubuntu 18.04
Python Version (if applicable): 3.6.9
TensorFlow Version (if applicable):
PyTorch Version (if applicable): 1.6.0
Baremetal or Container (if container which image + tag): nvcr.io/nvidia/pytorch:20.12-py3

Relevant Files

https://drive.google.com/file/d/1IqY2-YXFg88rjsPc4oWvmEiRLbK5koJn/view?usp=sharing
onnx_model

Steps To Reproduce

  1. First, I implemented Mask-RCNN from the PyTorch library and converted it to ONNX format with the attached script (in my environment).

  2. Then, when I tried to convert it to TensorRT inside the NVIDIA Docker container, I hit an error, so I ran the following from the terminal:

$ polygraphy surgeon sanitize model.onnx --fold-constants --output model_folded.onnx

  3. In the end, when I tried to convert the folded model, I got this error:

[I] [TRT] /workspace/TensorRT/parsers/onnx/ModelImporter.cpp:139: No importer registered for op: SplitToSequence. Attempting to import as plugin.
[I] [TRT] /workspace/TensorRT/parsers/onnx/builtin_op_importers.cpp:3716: Searching for plugin: SplitToSequence, plugin_version: 1, plugin_namespace:
[E] [TRT] INVALID_ARGUMENT: getPluginCreator could not find plugin SplitToSequence version 1
While parsing node number 339 [SplitToSequence]:
ERROR: /workspace/TensorRT/parsers/onnx/builtin_op_importers.cpp:3718 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"

I used this command (in NVIDIA docker) to convert the model:

$ trtexec --onnx=model_folded.onnx --explicitBatch --fp16 --workspace=64 --minShapes=input:1x3x720x1280 --optShapes=input:1x3x720x1280 --maxShapes=input:1x3x720x1280 --buildOnly --saveEngine=mask.engine

Will this plugin be supported in the near future? Or did I do something wrong during conversion?
Thanks

Hi,
Could you share the ONNX model and the script, if you haven't already, so that we can assist you better?
Meanwhile, you can try a few things:

  1. Validate your model with the snippet below:

check_model.py

import onnx

filename = "model_folded.onnx"  # path to your ONNX model
model = onnx.load(filename)
onnx.checker.check_model(model)
  2. Try running your model with the trtexec command.

In case you are still facing the issue, please share the trtexec --verbose log for further debugging.
Thanks!

Thanks for your reply! I added both ONNX models - basic and folded. I also checked them with the ONNX checker, and it didn't report any errors. When I ran the trtexec command with --verbose, I got the following error:

[V] [TRT] /workspace/TensorRT/parsers/onnx/ModelImporter.cpp:107: Parsing node: SplitToSequence_1141 [SplitToSequence]
[V] [TRT] /workspace/TensorRT/parsers/onnx/ModelImporter.cpp:123: Searching for input: 2396
[V] [TRT] /workspace/TensorRT/parsers/onnx/ModelImporter.cpp:123: Searching for input: 2422
[V] [TRT] /workspace/TensorRT/parsers/onnx/ModelImporter.cpp:129: SplitToSequence_1141 [SplitToSequence] inputs: [2396 → (1, -1)], [2422 → (5)],
[I] [TRT] /workspace/TensorRT/parsers/onnx/ModelImporter.cpp:139: No importer registered for op: SplitToSequence. Attempting to import as plugin.
[I] [TRT] /workspace/TensorRT/parsers/onnx/builtin_op_importers.cpp:3716: Searching for plugin: SplitToSequence, plugin_version: 1, plugin_namespace:
[E] [TRT] INVALID_ARGUMENT: getPluginCreator could not find plugin SplitToSequence version 1
While parsing node number 339 [SplitToSequence → “2423”]:
— Begin node —
input: “2396”
input: “2422”
output: “2423”
name: “SplitToSequence_1141”
op_type: “SplitToSequence”
attribute {
name: “axis”
i: 1
type: INT
}

— End node —
ERROR: /workspace/TensorRT/parsers/onnx/builtin_op_importers.cpp:3718 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"

Hi @aleksandra.osztynowicz1,

TensorRT's ONNX parser doesn't support the SplitToSequence operator yet. We recommend implementing a custom plugin for it.
For your reference,
https://docs.nvidia.com/deeplearning/tensorrt/developer-guide/index.html#add_custom_layer_python

You can check the supported operators here: onnx-tensorrt/operators.md at master · onnx/onnx-tensorrt · GitHub
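If it helps with the plugin implementation: per the ONNX operator spec, SplitToSequence splits a tensor along an axis into a sequence of tensors, where the `split` input may be absent (length-1 slices, optionally squeezed via `keepdims`), a scalar chunk size, or a 1-D list of chunk lengths. A minimal NumPy sketch of those semantics (the function name and simplifications are mine; this is a reference for the expected outputs, not the plugin itself):

```python
import numpy as np

def split_to_sequence(data, split=None, axis=0, keepdims=True):
    """NumPy reference for ONNX SplitToSequence semantics (simplified sketch)."""
    if split is None:
        # No split input: one slice of size 1 per position along `axis`;
        # the axis is squeezed out when keepdims is False.
        parts = np.split(data, data.shape[axis], axis=axis)
        if not keepdims:
            parts = [np.squeeze(p, axis=axis) for p in parts]
        return parts
    split = np.asarray(split)
    if split.ndim == 0:
        # Scalar split: equal chunks of this size; last chunk may be smaller.
        size = int(split)
        indices = list(range(size, data.shape[axis], size))
        return np.split(data, indices, axis=axis)
    # 1-D split: explicit chunk lengths along `axis`
    # (the failing node above passes a 1-D split tensor of shape (5,)).
    indices = np.cumsum(split)[:-1]
    return np.split(data, indices, axis=axis)

# Example mirroring a (1, N) input split along axis 1, as in the node above.
x = np.arange(10).reshape(1, 10)
chunks = split_to_sequence(x, split=[2, 2, 2, 2, 2], axis=1)
print(len(chunks))      # 5
print(chunks[0].shape)  # (1, 2)
```

A custom plugin (or a graph rewrite that replaces the node with plain Split ops) would need to reproduce exactly this chunking behavior.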

Thank you.