Import pytorch model in TensorRT and add custom plugin layer

I have two questions

  1. I want to import a PyTorch model into TensorRT. How can I do that?
  2. I want to add custom layers (plugins) to the imported model, including an upsampling layer that has to be inserted between existing layers of the network. How can I do that after importing the model from PyTorch?


To import PyTorch into TensorRT: you’d save your PyTorch model as an ONNX file, then import that model into TRT.

This works via either the Python or the C++ API:

Take a look at this sample:

Regarding custom layers: users can extend TensorRT functionality by implementing custom layers using the IPluginV2 class in the C++ and Python APIs. Please reference


TensorRT 5 GA only provides IPluginV2 examples for Caffe/TensorFlow. But how do I implement custom layers for ONNX models?

I need some examples, thanks!

Hi, I ran into this problem too. How can I implement custom layers for ONNX models? I just need to add a PReLU layer.
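Whatever registration mechanism ends up working, the math a PReLU plugin has to reproduce in its enqueue is just an elementwise op. A plain-Python sketch of that math (the function names here are mine, not TensorRT API):

```python
def prelu(x, alpha=0.25):
    # PReLU: pass positive values through unchanged,
    # scale negative values by the learned slope alpha.
    return x if x > 0.0 else alpha * x

def prelu_layer(values, alpha=0.25):
    # Elementwise application over a flat buffer, as a plugin's
    # enqueue() would do over the input tensor's elements.
    return [prelu(v, alpha) for v in values]

print(prelu_layer([2.0, -4.0, 0.5]))  # [2.0, -1.0, 0.5]
```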

Hey qwe258sj,
Were you able to register your custom plugin through nvonnxparser?

Hi ebraheem,
I haven’t found a way yet.

Hi, for ONNX models I need an example using the C++ IPluginV2 class to add a layer …

@NVES, could you please answer the questions above? I think adding custom layers in onnx-tensorrt is a common need.