Can anyone provide some examples of how to define custom layers for an ONNX model?

TensorRT version: 5.0.2.6

Question: How do I define custom layers for an ONNX model? Any example would be helpful. Thanks!

Details:

I notice that NvOnnxParserRuntime.h provides the API IPluginFactory* createPluginFactory for ONNX custom layers; however, I cannot find any examples of how to use this API. By the way, the official documentation only provides custom-layer examples for Caffe and UFF models.

Although it is possible to compile the ONNX parser library from the open-source project onnx-tensorrt (https://github.com/onnx/onnx-tensorrt), I don't think that is a good way to address my issue.
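
For reference, my current (possibly wrong) reading of that header is that the factory is only meant for re-creating the parser's own built-in plugins when deserializing an engine, roughly like the sketch below. The "engine.trt" path is just a placeholder, and I'm assuming that nvonnxparser::IPluginFactory can be passed to IRuntime::deserializeCudaEngine and that it exposes a destroy() method, as the header of that era seems to suggest:

#include <NvInfer.h>
#include <NvOnnxParserRuntime.h>
#include <fstream>
#include <iostream>
#include <iterator>
#include <vector>

class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity != Severity::kINFO) std::cerr << msg << std::endl;
    }
};

int main()
{
    Logger logger;

    // Load an engine that was previously built from an ONNX model ("engine.trt" is a placeholder).
    std::ifstream file("engine.trt", std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(file)), std::istreambuf_iterator<char>());

    nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(logger);

    // Factory that re-creates the plugins the ONNX parser itself inserted into the network.
    nvonnxparser::IPluginFactory* factory = nvonnxparser::createPluginFactory(logger);

    nvinfer1::ICudaEngine* engine = runtime->deserializeCudaEngine(blob.data(), blob.size(), factory);

    // ... run inference with the engine ...

    engine->destroy();
    factory->destroy(); // assuming the factory exposes destroy(), as in the header of that era
    runtime->destroy();
    return 0;
}

If that reading is correct, this factory does not help with user-defined custom layers at all, which is exactly what I am trying to figure out.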

Hello,

Unfortunately, we don't currently have custom-layer examples for ONNX models. I will pass your feedback to the engineering team.

I met the same problem; the official answer is to replace libonnxparser.so with the library built from onnx-tensorrt.
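
Either way, the custom layer itself still has to be implemented against TensorRT's regular plugin API. Below is a minimal, hedged pass-through skeleton (IPluginV2 plus an IPluginCreator registered with REGISTER_TENSORRT_PLUGIN). Every MyCustomOp* name is a placeholder, and a rebuilt onnx-tensorrt parser would still need an op importer that emits this plugin via INetworkDefinition::addPluginV2; that wiring is not shown here.

// Hedged sketch: a pass-through "MyCustomOp" plugin (IPluginV2) and its creator,
// registered through REGISTER_TENSORRT_PLUGIN. All MyCustomOp* names are placeholders.
#include <NvInfer.h>
#include <cuda_runtime_api.h>
#include <cstring>
#include <string>

using namespace nvinfer1;

class MyCustomOpPlugin : public IPluginV2
{
public:
    MyCustomOpPlugin() {}
    // Used by the creator when deserializing an engine.
    MyCustomOpPlugin(const void* data, size_t) { std::memcpy(&mVolume, data, sizeof(mVolume)); }

    const char* getPluginType() const override { return "MyCustomOp"; }
    const char* getPluginVersion() const override { return "1"; }
    int getNbOutputs() const override { return 1; }
    Dims getOutputDimensions(int, const Dims* inputs, int) override { return inputs[0]; }
    bool supportsFormat(DataType type, PluginFormat format) const override
    { return type == DataType::kFLOAT && format == PluginFormat::kNCHW; }
    void configureWithFormat(const Dims* inputDims, int, const Dims*, int,
                             DataType, PluginFormat, int) override
    {
        // Remember the per-sample element count for enqueue/serialize.
        mVolume = 1;
        for (int i = 0; i < inputDims[0].nbDims; ++i) mVolume *= inputDims[0].d[i];
    }
    int initialize() override { return 0; }
    void terminate() override {}
    size_t getWorkspaceSize(int) const override { return 0; }
    int enqueue(int batchSize, const void* const* inputs, void** outputs,
                void*, cudaStream_t stream) override
    {
        // Placeholder "kernel": just copy input to output; a real layer launches CUDA work here.
        size_t bytes = size_t(batchSize) * mVolume * sizeof(float);
        cudaMemcpyAsync(outputs[0], inputs[0], bytes, cudaMemcpyDeviceToDevice, stream);
        return 0;
    }
    size_t getSerializationSize() const override { return sizeof(mVolume); }
    void serialize(void* buffer) const override { std::memcpy(buffer, &mVolume, sizeof(mVolume)); }
    void destroy() override { delete this; }
    IPluginV2* clone() const override { return new MyCustomOpPlugin(*this); }
    void setPluginNamespace(const char* ns) override { mNamespace = ns; }
    const char* getPluginNamespace() const override { return mNamespace.c_str(); }

private:
    size_t mVolume{0};
    std::string mNamespace;
};

class MyCustomOpCreator : public IPluginCreator
{
public:
    const char* getPluginName() const override { return "MyCustomOp"; }
    const char* getPluginVersion() const override { return "1"; }
    const PluginFieldCollection* getFieldNames() override { return &mFC; }
    IPluginV2* createPlugin(const char*, const PluginFieldCollection*) override
    { return new MyCustomOpPlugin(); } // a real creator reads its attributes from the field collection
    IPluginV2* deserializePlugin(const char*, const void* data, size_t length) override
    { return new MyCustomOpPlugin(data, length); }
    void setPluginNamespace(const char* ns) override { mNamespace = ns; }
    const char* getPluginNamespace() const override { return mNamespace.c_str(); }

private:
    PluginFieldCollection mFC{};
    std::string mNamespace;
};

// Makes the creator discoverable through getPluginRegistry() at library load time.
REGISTER_TENSORRT_PLUGIN(MyCustomOpCreator);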

@xautlxf, can you explain your answer a bit? So, in order to use

IPluginFactory* createPluginFactory

for custom layers, we need to replace

libonnxparser.so

in the TensorRT installation?

Hello,
Is there any news?
I'm also looking for an end-to-end example of how to implement and load an ONNX custom layer as a plugin…

Thanks,