Is there a way to do weight normalization in TensorRT?

Hi,

I have a custom layer that I want to try to replicate in TensorRT. It is basically a weight normalization operation.

class Conv2DWeightNorm(tf.layers.Conv2D):

  def build(self, input_shape):
    # Per-filter gain g, learned alongside the kernel's direction
    self.wn_g = self.add_weight(
        name='wn_g',
        shape=(self.filters,),
        dtype=self.dtype,
        initializer=tf.initializers.ones,
        trainable=True,
    )
    super(Conv2DWeightNorm, self).build(input_shape)
    # Normalize each filter to unit L2 norm over the spatial and
    # input-channel axes, then rescale by the gain
    square_sum = tf.reduce_sum(
        tf.square(self.kernel), [0, 1, 2], keepdims=False)
    inv_norm = tf.rsqrt(square_sum)
    self.kernel = self.kernel * (inv_norm * self.wn_g)

The idea is basically that it normalizes each filter's weight vector to length 1, preserving only its direction, and then rescales it by the learned gain wn_g.
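Since the normalization only involves constant weights at inference time, one workaround may be to fold it into the kernel offline before export, so the parser never sees the reduce ops and the layer becomes a plain Conv2D. A minimal NumPy sketch of that idea (fold_weight_norm is a hypothetical helper name; it assumes the HWIO kernel layout that tf.layers.Conv2D uses):

```python
import numpy as np

def fold_weight_norm(kernel, wn_g):
    """Fold the weight-norm scale into a plain conv kernel.

    kernel: HWIO array (height, width, in_ch, out_ch)
    wn_g:   per-filter gain, shape (out_ch,)
    Returns an equivalent kernel with no extra ops at inference time.
    """
    # L2 norm of each filter over the spatial and input-channel axes,
    # mirroring tf.reduce_sum(tf.square(kernel), [0, 1, 2])
    square_sum = np.sum(np.square(kernel), axis=(0, 1, 2))
    inv_norm = 1.0 / np.sqrt(square_sum)
    # Broadcasts over the output-channel (last) axis
    return kernel * (inv_norm * wn_g)
```

After folding the trained kernel and wn_g this way, the exported graph only needs an ordinary convolution, which the UFF parser already handles.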

Currently I'm trying to port the model using the UFF parser and the experience has been a nightmare. It's giving me the error

[TensorRT] ERROR: UFFParser: Parser error: model_lr/0/conv2d_weight_norm/Sum: axes.size() != 0 not yet supported for reduce constant nodes

I also can't seem to set dimensions with -1 size (for variable input sizes in fully convolutional networks), but I guess that's another question.

Thanks