No importer registered for op: CumSum

Description

I’m trying to convert a simple ONNX model to TensorRT using trtexec, but the conversion fails because no CumSum plugin is registered.

Here are some of the output messages:

[W] [TRT] onnx2trt_utils.cpp:220: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[E] [TRT] INVALID_ARGUMENT: getPluginCreator could not find plugin CumSum version 1
ERROR: builtin_op_importers.cpp:3773 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"
[E] Failed to parse onnx file
[E] Parsing model failed
[E] Engine creation failed
[E] Engine set up failed

Environment

TensorRT Version: 7.2.2.3
GPU Type: GTX 1060
Nvidia Driver Version: 460.39
CUDA Version: 11.1
CUDNN Version: 8.0.5
Operating System + Version: Ubuntu 18.04
Python Version (if applicable): 3.8.8
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):

Relevant Files

The model and log file can be found here: Onnx - Google Drive

Steps To Reproduce

trtexec --onnx=test.onnx

Hi @lchewhun,
You need to implement a custom plugin to have your op supported.
Please refer to the example below.
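
As a rough sketch for TensorRT 7.2, a CumSum plugin skeleton could look like the following. The class names CumSumPlugin / CumSumPluginCreator are placeholders, the shape/type methods simply pass the input through, and enqueue() is a stub; a complete version also has to handle CumSum's axis input and its exclusive/reverse attributes and launch a real CUDA kernel. The open-source plugins in the TensorRT GitHub repository are a good reference for a full implementation.

// cumsum_plugin.cpp -- minimal skeleton of a custom CumSum plugin (sketch only).
#include <string>
#include <cuda_runtime_api.h>
#include <NvInfer.h>

using namespace nvinfer1;

namespace
{
constexpr const char* kPLUGIN_NAME = "CumSum"; // must match the ONNX op name
constexpr const char* kPLUGIN_VERSION = "1";   // must match the version in the error
} // namespace

class CumSumPlugin : public IPluginV2DynamicExt
{
public:
    CumSumPlugin() = default;
    CumSumPlugin(const void* /*data*/, size_t /*length*/) {} // deserialization ctor

    // IPluginV2DynamicExt
    IPluginV2DynamicExt* clone() const override { return new CumSumPlugin(); }
    DimsExprs getOutputDimensions(int, const DimsExprs* inputs, int, IExprBuilder&) override
    {
        return inputs[0]; // CumSum output has the same shape as its data input
    }
    bool supportsFormatCombination(int pos, const PluginTensorDesc* inOut, int, int) override
    {
        return inOut[pos].format == TensorFormat::kLINEAR && inOut[pos].type == DataType::kFLOAT;
    }
    void configurePlugin(const DynamicPluginTensorDesc*, int, const DynamicPluginTensorDesc*, int) override {}
    size_t getWorkspaceSize(const PluginTensorDesc*, int, const PluginTensorDesc*, int) const override { return 0; }
    int enqueue(const PluginTensorDesc*, const PluginTensorDesc*, const void* const* /*inputs*/,
                void* const* /*outputs*/, void* /*workspace*/, cudaStream_t /*stream*/) override
    {
        // TODO: launch the cumulative-sum CUDA kernel here
        return 0;
    }

    // IPluginV2Ext
    DataType getOutputDataType(int, const DataType* inputTypes, int) const override { return inputTypes[0]; }

    // IPluginV2
    const char* getPluginType() const override { return kPLUGIN_NAME; }
    const char* getPluginVersion() const override { return kPLUGIN_VERSION; }
    int getNbOutputs() const override { return 1; }
    int initialize() override { return 0; }
    void terminate() override {}
    size_t getSerializationSize() const override { return 0; }
    void serialize(void*) const override {}
    void destroy() override { delete this; }
    void setPluginNamespace(const char* ns) override { mNamespace = ns; }
    const char* getPluginNamespace() const override { return mNamespace.c_str(); }

private:
    std::string mNamespace;
};

// getPluginCreator() looks the creator up by name/version/namespace, so these
// strings are exactly what the "Plugin not found" assertion is checking.
class CumSumPluginCreator : public IPluginCreator
{
public:
    const char* getPluginName() const override { return kPLUGIN_NAME; }
    const char* getPluginVersion() const override { return kPLUGIN_VERSION; }
    const PluginFieldCollection* getFieldNames() override { return &mFC; }
    IPluginV2* createPlugin(const char*, const PluginFieldCollection*) override { return new CumSumPlugin(); }
    IPluginV2* deserializePlugin(const char*, const void* data, size_t length) override
    {
        return new CumSumPlugin(data, length);
    }
    void setPluginNamespace(const char* ns) override { mNamespace = ns; }
    const char* getPluginNamespace() const override { return mNamespace.c_str(); }

private:
    PluginFieldCollection mFC{};
    std::string mNamespace;
};

// Registers the creator with the global plugin registry when the library is loaded.
REGISTER_TENSORRT_PLUGIN(CumSumPluginCreator);

Once this is compiled into a shared library, it can be loaded at conversion time, e.g. trtexec --onnx=test.onnx --plugins=<path to the .so>, so the ONNX parser can find the CumSum creator in the plugin registry.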

Thanks!