TensorRT: No conversion function registered for layer: Neg yet.

I’m using tf.negative as part of a parameterized ReLU in TensorFlow, and conversion fails with the following error:

Warning: No conversion function registered for layer: Neg yet.
Converting as custom op Neg pnet/PReLU1/Neg
name: "pnet/PReLU1/Neg"
op: "Neg"
input: "pnet/conv1/BiasAdd"
attr {
  key: "T"
  value {
    type: DT_FLOAT
  }
}

I’m using TensorRT 4. The docs clearly say that TensorFlow negative is supported. Any thoughts on this one?

Which version of TensorFlow are you using?

The newest version of TensorFlow, 1.10.

Hello samsonf89m5,

The supported op name in the converter is "Negative", but your graph uses "Neg", so it is reported as unsupported. The converter expects the op to be called Negative; it’s possible the name has changed in the latest version of TF. Also, you are using TensorFlow 1.10, and unfortunately TensorRT 4.0 doesn’t support TF 1.10 yet.
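If the only mismatch is the op name, one possible workaround is to rewrite the op type in the frozen GraphDef before running the UFF converter. A minimal sketch follows; the small Node class is a stand-in for a TensorFlow NodeDef so the snippet is self-contained, and in practice you would pass `graph_def.node` from your frozen graph instead:

```python
class Node:
    """Stand-in for a TensorFlow NodeDef (only the fields we touch)."""
    def __init__(self, name, op):
        self.name = name
        self.op = op

def rename_op(nodes, old_op, new_op):
    """Rewrite the op type of every matching node in place.

    Returns the names of the nodes that were renamed, so the
    caller can verify what was touched before converting.
    """
    renamed = []
    for node in nodes:
        if node.op == old_op:
            node.op = new_op
            renamed.append(node.name)
    return renamed

# Example mirroring the graph fragment from the error message above.
nodes = [
    Node("pnet/conv1/BiasAdd", "BiasAdd"),
    Node("pnet/PReLU1/Neg", "Neg"),
]
changed = rename_op(nodes, "Neg", "Negative")
print(changed)  # -> ['pnet/PReLU1/Neg']
```

Whether the converter then handles the renamed node correctly still depends on the TF/TensorRT version pairing, so treat this as something to try, not a guaranteed fix.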

For leaky ReLU, you can use the provided LReLU plugin.
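Alternatively, the Neg op can often be avoided entirely by rewriting the activation in terms of ops the converter does handle (Relu, Abs, Sub, Mul). This is just the algebra of the decomposition, shown in NumPy rather than the TensorFlow/TensorRT API; in a real PReLU, alpha would be a learned per-channel parameter:

```python
import numpy as np

def prelu_reference(x, alpha):
    """Standard PReLU: x for x >= 0, alpha * x for x < 0."""
    return np.where(x >= 0, x, alpha * x)

def prelu_no_neg(x, alpha):
    """Same function without a Neg op.

    For x >= 0: |x| - x = 0, so the correction term vanishes.
    For x <  0: |x| - x = -2x, so relu(x) - 0.5*alpha*(|x| - x)
    reduces to alpha * x.  Only Relu/Abs/Sub/Mul are needed.
    """
    return np.maximum(x, 0.0) - 0.5 * alpha * (np.abs(x) - x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
alpha = 0.25
assert np.allclose(prelu_reference(x, alpha), prelu_no_neg(x, alpha))
```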

Looks like there is support for TF 1.7, and probably TF 1.8. Got it.

For using leaky relu, the user can use provided LReLU plugin.

Can you send me a pointer to it? When I look here, I don’t see lrelu:
https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html

Thanks!

– Samson

Hello,

For LReLU, please reference https://docs.nvidia.com/deeplearning/sdk/tensorrt-api/c_api/_nv_infer_plugin_8h_source.html

https://docs.nvidia.com/deeplearning/sdk/tensorrt-api/c_api/_nv_infer_plugin_8h.html#a3a76b4db492a6d2aabbc3b9ec9d73ed1