TensorRT

Description

There is no PReLU option among TensorRT 8's built-in activation types (trt.ActivationType). Can anyone suggest a way to add a PReLU activation operation (f(x) = max(0, x) + a * min(0, x), with a learnable slope a) using the TensorRT Python API?
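
For example, listing the activation kinds exposed by the Python API shows no PReLU entry (the exact members depend on your TensorRT build):

```python
import tensorrt as trt

# Built-in activation kinds exposed by the Python API.
print(list(trt.ActivationType.__members__))
# ['RELU', 'SIGMOID', 'TANH', 'LEAKY_RELU', 'ELU', ...] -- no PRELU
```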

Environment

TensorRT Version : 8.0
GPU Type :
Nvidia Driver Version : 470.57.02
CUDA Version : 11.3
CUDNN Version : 8.2
Operating System + Version : Ubuntu 20.04
Python Version (if applicable) : 3.8
TensorFlow Version (if applicable) : 2.5
PyTorch Version (if applicable) :
Baremetal or Container (if container which image + tag) :

Relevant Files

Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (GitHub repo, Google Drive, Dropbox, etc.)

Steps To Reproduce

Please include:

  • Exact steps/commands to build your repro
  • Exact steps/commands to run your repro
  • Full traceback of errors encountered

Hi,
Please refer to the links below for details on custom plugin implementation and a sample:

The IPluginV2 and IPluginV2Ext interfaces are still supported for backward compatibility with TensorRT 5.1 and 6.0.x respectively; however, we recommend that you write new plugins or refactor existing ones to target the IPluginV2DynamicExt or IPluginV2IOExt interfaces instead.
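
As a rough sketch of the Python side only (the plugin itself must be implemented in C++ against one of those interfaces and compiled into a shared library; the library name libprelu_plugin.so, the registered plugin name PReLU_TRT, and the field name slopes below are illustrative placeholders, not a shipped plugin):

```python
import ctypes

import numpy as np
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)

# Loading the compiled plugin library runs its static registration
# code, which adds the plugin creator to TensorRT's global registry.
ctypes.CDLL("./libprelu_plugin.so")  # hypothetical library name
trt.init_libnvinfer_plugins(logger, "")

# Look up the creator by the name/version registered on the C++ side.
registry = trt.get_plugin_registry()
creator = registry.get_plugin_creator("PReLU_TRT", "1")  # hypothetical name

# Hand the learned slope(s) to the plugin as a creation-time field.
slopes = np.array([0.25], dtype=np.float32)  # illustrative value
fields = trt.PluginFieldCollection(
    [trt.PluginField("slopes", slopes, trt.PluginFieldType.FLOAT32)]
)
plugin = creator.create_plugin("prelu", fields)

# Wire the plugin into a network built with the Python API.
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
x = network.add_input("x", trt.float32, (1, 3, 32, 32))
prelu = network.add_plugin_v2(inputs=[x], plugin=plugin)
network.mark_output(prelu.get_output(0))
```

Separately, note that recent TensorRT releases also expose INetworkDefinition.add_parametric_relu (IParametricReLULayer) in the Python API, which computes PReLU directly with the slopes supplied as a second (broadcastable) tensor; if that layer is present in your 8.x build, it may remove the need for a custom plugin altogether.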

Thanks!