ONNX to TensorRT Engine conversion issue

Hi,

I have converted my MXNet model to ONNX format and now want to run inference with TensorRT. I used the onnx-tensorrt project to do so, but I am stuck at the error below:

==========================================

Input filename: resnet100.onnx
ONNX IR version: 0.0.3
Opset version: 8
Producer name:
Producer version:
Domain:
Model version: 0
Doc string:

Parsing model
While parsing node number 6 [PRelu -> "relu0"]:
ERROR: /home/nano/Pallab/TRT/onnx-tensorrt/ModelImporter.cpp:143 In function importNode:
[8] No importer registered for op: PRelu

============================================

TensorRT version: 5.1.6.1
Device: Jetson Nano

From various forum posts I learned that PRelu op support is missing from the ONNX parser and can be added as a plugin.
What would be the best approach to do this, so that TensorRT can be used for inference?

Regards
Pallab Sarkar

Hi,
Please refer to the steps described here for extending TensorRT with custom layers using the plugin APIs: https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#extending
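
To give a rough idea of what that section describes, below is a minimal, untested sketch of an FP32/NCHW PRelu plugin for the TensorRT 5.x IPluginV2 interface. The file name prelu_plugin.cu, the class PReluPlugin and the kernel are illustrative names only; error checking and full engine serialization are left out, and the per-channel slope values are assumed to have been read from the initializer of the PRelu node in your ONNX model.

// prelu_plugin.cu -- illustrative sketch, assuming FP32 / NCHW and per-channel slopes.
#include <NvInfer.h>
#include <cuda_runtime.h>
#include <cstring>
#include <string>
#include <vector>

using namespace nvinfer1;

// PRelu: y = x for x > 0, y = slope[c] * x otherwise (one slope per channel).
__global__ void preluKernel(const float* in, const float* slope, float* out,
                            int count, int spatialSize, int channels)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= count) return;
    int c = (i / spatialSize) % channels;   // channel index in NCHW layout
    float v = in[i];
    out[i] = v > 0.f ? v : v * slope[c];
}

class PReluPlugin : public IPluginV2
{
public:
    explicit PReluPlugin(const std::vector<float>& slopes) : mSlopes(slopes) {}
    // A deserialization constructor and an IPluginCreator are also needed if
    // you serialize the engine; they are omitted here for brevity.

    int getNbOutputs() const override { return 1; }
    Dims getOutputDimensions(int, const Dims* inputs, int) override { return inputs[0]; }

    bool supportsFormat(DataType type, PluginFormat format) const override
    { return type == DataType::kFLOAT && format == PluginFormat::kNCHW; }

    void configureWithFormat(const Dims* inputDims, int, const Dims*, int,
                             DataType, PluginFormat, int) override
    {
        mChannels    = inputDims[0].d[0];                      // C
        mSpatialSize = inputDims[0].d[1] * inputDims[0].d[2];  // H * W
    }

    int initialize() override
    {
        // Copy the per-channel slopes to the GPU once.
        cudaMalloc(reinterpret_cast<void**>(&mDevSlopes), mSlopes.size() * sizeof(float));
        cudaMemcpy(mDevSlopes, mSlopes.data(), mSlopes.size() * sizeof(float),
                   cudaMemcpyHostToDevice);
        return 0;
    }
    void terminate() override { cudaFree(mDevSlopes); mDevSlopes = nullptr; }

    size_t getWorkspaceSize(int) const override { return 0; }

    int enqueue(int batchSize, const void* const* inputs, void** outputs,
                void*, cudaStream_t stream) override
    {
        int count = batchSize * mChannels * mSpatialSize;
        int block = 256, grid = (count + block - 1) / block;
        preluKernel<<<grid, block, 0, stream>>>(
            static_cast<const float*>(inputs[0]), mDevSlopes,
            static_cast<float*>(outputs[0]), count, mSpatialSize, mChannels);
        return cudaGetLastError() == cudaSuccess ? 0 : -1;
    }

    // Minimal serialization: only the slope values are stored.
    size_t getSerializationSize() const override { return mSlopes.size() * sizeof(float); }
    void serialize(void* buffer) const override
    { std::memcpy(buffer, mSlopes.data(), getSerializationSize()); }

    const char* getPluginType() const override { return "PRelu"; }
    const char* getPluginVersion() const override { return "1"; }
    void setPluginNamespace(const char* ns) override { mNamespace = ns; }
    const char* getPluginNamespace() const override { return mNamespace.c_str(); }
    IPluginV2* clone() const override { return new PReluPlugin(mSlopes); }
    void destroy() override { delete this; }

private:
    std::vector<float> mSlopes;    // per-channel PRelu slopes (host copy)
    float* mDevSlopes{nullptr};    // device copy used by the kernel
    int mChannels{0}, mSpatialSize{0};
    std::string mNamespace;
};

Note that the ONNX parser will not pick this up automatically: you still need to create the plugin at the point where the PRelu node is imported (for example by adding an importer for it in onnx-tensorrt's builtin_op_importers.cpp) and attach it to the network with INetworkDefinition::addPluginV2. See the linked developer-guide section for the full registration flow.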

Thanks