Hey team,
I know that PReLU is not currently supported by TensorRT, but is there a way to work around these layers and accelerate only the layers TensorRT does support?
Alternatively, how can I use the Plugin layer to integrate a custom layer? Are there any examples?
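For reference, this is the kind of workaround I had in mind for the first option: expressing PReLU with only elementwise ops that TensorRT supports natively (ReLU, scale, subtraction). The NumPy snippet below just checks the math; it is not TensorRT API code, and the function names are my own:

```python
import numpy as np

def prelu(x, alpha):
    # Reference PReLU: x if x > 0, else alpha * x
    return np.where(x > 0, x, alpha * x)

def prelu_decomposed(x, alpha):
    # Same result using only natively supported elementwise ops:
    # PReLU(x) = ReLU(x) - alpha * ReLU(-x)
    relu = lambda v: np.maximum(v, 0.0)
    return relu(x) - alpha * relu(-x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
alpha = 0.25
print(prelu(x, alpha))                                           # [-0.5 -0.125 0. 1. 3.]
print(np.allclose(prelu(x, alpha), prelu_decomposed(x, alpha)))  # True
```

If this decomposition is valid, would building the network from ReLU + scale layers be the recommended way, or is a plugin still preferable for performance?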