A question about custom layer plugins when using the ONNX parser

Description

 When I build a TensorRT network with addLayer calls, I can add a custom layer through the “IPluginV2” and “IPluginCreator” interfaces. When I use the ONNX parser instead, can I only add a custom layer through the “IPluginV2” and “IPluginFactory” interfaces? 
If only “IPluginFactory” can be used, its “createPlugin” function only accepts the “weights” pointer and “nbWeights” arguments — so how can I pass other parameters to the plugin? 

Environment

TensorRT Version: TensorRT 5.1.4 (DRIVE Software 10)
GPU Type: DRIVE Xavier
Nvidia Driver Version:
CUDA Version: 10.2
CUDNN Version: 7.6
Operating System + Version: Ubuntu
Python Version (if applicable):
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):

Relevant Files

Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (Github repo, Google Drive, Dropbox, etc.)

Steps To Reproduce

Please include:

  • Exact steps/commands to build your repro
  • Exact steps/commands to run your repro
  • Full traceback of errors encountered

Hi,
Please refer to the links below for custom plugin implementation details and a sample:

Thanks!

Thanks for your reply. I followed the recommendations in the document above and set the return value of the plugin creator’s getPluginName() function to match the op type of the node in the ONNX model, but parsing still fails with “error: parse node 0”. However, when I call getPluginCreatorList, I can see this plugin. What could be the reason?
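One thing worth double-checking in this situation is that the parser's lookup matches on more than the name: the creator's plugin version (and namespace) must also line up with what is requested. A minimal plain-Python sketch of that lookup, with no TensorRT dependency — the registered set and the name "MyCustomOp" are placeholders; in real code the registered creators come from trt.get_plugin_registry().plugin_creator_list:

```python
# Hypothetical mirror of the registry lookup: a custom op resolves only
# if a creator with the same name AND version is registered.
registered = {("MyCustomOp", "1")}

def parser_can_resolve(op_type, plugin_version="1"):
    # The real lookup compares getPluginName() against the node's op
    # type and getPluginVersion() against the requested version.
    return (op_type, plugin_version) in registered

print(parser_can_resolve("MyCustomOp", "1"))  # name and version match
print(parser_can_resolve("MyCustomOp", "2"))  # version mismatch
```

This is only a sketch of the matching rule, not the parser's actual code path; whether the ONNX parser consults the plugin registry at all also depends on the TensorRT version in use.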

Hi @lyw199441,

Could you please install the latest TensorRT 7.2.x:
https://docs.nvidia.com/deeplearning/tensorrt/install-guide/index.html

and refer to the following doc for adding custom layers:
https://docs.nvidia.com/deeplearning/tensorrt/developer-guide/index.html#add_custom_layer

Thank you.