Description
I can't find a PReLU activation function layer in TensorRT 8. Can anyone suggest a way to add a PReLU activation operation using the TensorRT Python API?
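For reference, here is the math I want to reproduce. If I understand the docs correctly, recent TensorRT versions (7.x and later) expose `INetworkDefinition.add_parametric_relu(input, slopes)` for exactly this, but if that is not available, PReLU can be decomposed into layers TensorRT definitely has (ReLU activation, unary negation, elementwise product/subtraction). The sketch below verifies that decomposition in NumPy; the TensorRT layer names in the comments are my assumptions and should be checked against the installed version:

```python
import numpy as np

def prelu_reference(x, alpha):
    """PReLU: pass positives through, scale negatives by alpha."""
    return np.where(x > 0, x, alpha * x)

def prelu_decomposed(x, alpha):
    """Same function built only from ReLU, negation, and multiply:

        PReLU(x) = ReLU(x) - alpha * ReLU(-x)

    Each step should map to a stock TensorRT layer (Python API names
    as I understand them -- please verify for your TRT version):
      ReLU(t)  -> network.add_activation(t, trt.ActivationType.RELU)
      -t       -> network.add_unary(t, trt.UnaryOperation.NEG)
      alpha*t  -> network.add_elementwise(c, t, trt.ElementWiseOperation.PROD)
      p - q    -> network.add_elementwise(p, q, trt.ElementWiseOperation.SUB)
    """
    relu = lambda t: np.maximum(t, 0.0)
    return relu(x) - alpha * relu(-x)

# Sanity check: the decomposition matches the definition.
x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
assert np.allclose(prelu_reference(x, 0.25), prelu_decomposed(x, 0.25))
```

With per-channel slopes (as in Keras/PyTorch PReLU), `alpha` would be a constant tensor broadcast against the input, e.g. from `network.add_constant`.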
Environment
TensorRT Version : 8.0
GPU Type :
Nvidia Driver Version : 470.57.02
CUDA Version : 11.3
CUDNN Version : 8.2
Operating System + Version : Ubuntu 20.04
Python Version (if applicable) : 3.8
TensorFlow Version (if applicable) : 2.5
PyTorch Version (if applicable) :
Baremetal or Container (if container which image + tag) :
Relevant Files
Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (Github repo, Google Drive, Dropbox, etc.)
Steps To Reproduce
Please include:
- Exact steps/commands to build your repro
- Exact steps/commands to run your repro
- Full traceback of errors encountered