Relu negative slope is not supported in Resnet34

When using TensorRT to deploy Resnet34, it reports error as below:

"
Caffe Paser: Relu negative slope is not supported
error parsing layer type Relu index 4
"

Doesn't TensorRT support Leaky ReLU yet?

Leaky ReLU is widely used in our models.

Hi,

TensorRT will support Leaky ReLU starting from v5.1.
If acceptable, you can replace your LReLU with ReLU + Scale.

Thanks.

Hi AastaLLL,

So when will v5.1 be released?

Leaky ReLU performs better than ReLU on our dataset and models, so we will not switch to ReLU.

Hi,

You can use ReLU + Scale to approximate it.
In general, it gives the same result.

Thanks.

Hi

(I have replied twice, but I don't know why the forum reports an error and deletes my reply!)

The thing is that we use Leaky ReLU widely in our models. Following your suggestion, we would have to change it to ReLU, re-train the models to check the results, and then change it back to Leaky ReLU in your next version. Our team thinks this is not worth the effort.

Another issue is that other platforms do support Leaky ReLU.

Hi,

This should not be an issue.

Leaky ReLU, ReLU, and Scale are all weight-free layers.
The negative slope is just a hyperparameter, so no re-training is needed.

Thanks.

Hi,

1. What is the Scale in ReLU + Scale? We discussed it but don't understand. Please provide a specific solution.

2. When you get the v5.1 release date, please tell me.

Hi,

1. For example, y = max(0.01x, x):

yi = xi       if xi > 0   -> generate this with a standard ReLU
yi = ai * xi  if xi < 0   -> generate this with a Scale layer, scaling factor ai

2. My colleague has sent you a private message. Please check it.

Thanks.
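The decomposition above can be checked numerically. Here is a minimal NumPy sketch (the function names are mine, not TensorRT APIs) showing that Leaky ReLU with slope a equals a standard ReLU plus a scaled ReLU applied to the negated input:

```python
import numpy as np

def leaky_relu(x, a=0.01):
    # Reference definition: y = max(a*x, x) for 0 < a < 1
    return np.maximum(a * x, x)

def relu_plus_scale(x, a=0.01):
    # ReLU keeps the positive part; a scaled ReLU of -x
    # reconstructs the negative part: y = relu(x) - a * relu(-x)
    return np.maximum(x, 0) - a * np.maximum(-x, 0)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(np.allclose(leaky_relu(x), relu_plus_scale(x)))  # True
```

Since both branches are weight-free, this rewrite changes only the graph structure, not the trained parameters.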

Hi

We have tried max(x, a*x), and the model is now supported. We will check the final results of the model later. Thanks.