Does TensorRT support leaky ReLU?

Hi, where can I find information about which operations are supported by TensorRT?

My model uses leaky ReLU, and there was no warning while converting the pb model to a UFF file, but the converted model's predictions are poor, which makes me wonder whether TensorRT supports leaky ReLU at all.

My environment is TensorRT 5.0.6.3 on a Jetson Nano.

I also want to ask: when I use uff.from_tensorflow_frozen_model to convert pb to UFF, does it always give a warning if there is an operation that TensorRT does not support?

I converted a model that contains an Abs operation and there was no warning during the UFF conversion; only when I ran the model in my program did it report that there is no net/Abs.
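Since the converter did not warn me, I now pre-check op types myself before converting. A minimal sketch, assuming the op types can be enumerated from the frozen GraphDef (the supported set below is only an illustrative subset, not the real TensorRT support matrix):

```python
# Sketch: flag op types that the UFF parser may not handle.
# SUPPORTED is only an illustrative subset -- consult the official
# TensorRT support matrix for the real list.
SUPPORTED = {"Conv2D", "BiasAdd", "Relu", "MaxPool", "MatMul", "Softmax"}

def find_unsupported(op_types, supported=SUPPORTED):
    """Return the op types in the graph that are not in the supported set."""
    return sorted(set(op_types) - supported)

# Example: op types gathered from a frozen graph,
# e.g. ops = [node.op for node in graph_def.node]
ops = ["Conv2D", "BiasAdd", "LeakyRelu", "Abs", "Softmax"]
print(find_unsupported(ops))  # hypothetical graph -> ['Abs', 'LeakyRelu']
```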

Thanks for any response.

Hi,

We list our support matrix in detail here:
https://docs.nvidia.com/deeplearning/sdk/tensorrt-support-matrix/index.html

Thanks.

@AastaLLL Thanks.

According to the link, LeakyReLU is supported by the TensorRT parser. Does that mean it can be converted from pb to UFF correctly and I don't need to write a custom plugin for it?

Thanks.

Hey @lens828

Please take a look here; I was able to convert LeakyReLU using the standard plugin:

https://devtalk.nvidia.com/default/topic/1055138/tensorrt/error-parsing-uff-model-quot-unsupported-operation-_leakyrelu-quot-/post/5348883/#5348883
Thanks

@r7vme Thanks for the answer.

I used the approach you described:

dynamic_graph.collapse_namespaces(
        {
            "net/conv1/LeakyRelu": gs.create_plugin_node(name="trt_lrelu1", op="LReLU_TRT", negSlope=0.1),
            "net/conv2/LeakyRelu": gs.create_plugin_node(name="trt_lrelu2", op="LReLU_TRT", negSlope=0.1),
            "net/conv3/LeakyRelu": gs.create_plugin_node(name="trt_lrelu3", op="LReLU_TRT", negSlope=0.1),
            "net/conv4/LeakyRelu": gs.create_plugin_node(name="trt_lrelu4", op="LReLU_TRT", negSlope=0.1),
        }
)
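As a sanity check on the negSlope parameter: it is the slope applied to negative inputs. A minimal pure-Python sketch of the function each of those plugin nodes should compute (just the leaky ReLU definition, not the plugin's actual implementation):

```python
def leaky_relu(x, neg_slope=0.1):
    """Leaky ReLU: pass positive values through, scale negatives by neg_slope."""
    return x if x > 0 else neg_slope * x

print(leaky_relu(2.0))   # -> 2.0
print(leaky_relu(-2.0))  # -> -0.2
```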

But it shows:

Warning: No conversion function registered for layer: LReLU_TRT yet.

It seems there is no standard plugin for LReLU_TRT?

I really appreciate your answer, but I have to say there are a lot of problems when using the TensorRT library.

Thanks.

Hey, in general this warning can be ignored.

You should register the plugins in uff_to_plan.cpp, which is used in the next step:

+#include <NvInferPlugin.h>
...
+  // Register all built-in TensorRT plugins (including LReLU_TRT)
+  // with the plugin registry before parsing the UFF model.
+  bool ok = initLibNvInferPlugins(&gLogger, "");
+  if (!ok) { return 1; }

Thanks @r7vme

I had stopped before that next step because of the warning, but just like you said, it works.

Thanks.