Can't include more than one Reduction layer with SUMSQ

Hi,

CUDA: 10.0
TensorRT: 5.0
CUDNN: 7.3

It seems that we cannot have two Reduction layers with the SUMSQ operation.

For example, here are two of the Reduction layers in my prototxt:

layer {
  name: "reduction_sumsq"
  type: "Reduction"
  bottom: "474"
  top: "475"
  reduction_param {
    operation: SUMSQ
    axis: 3
  }
}


layer {
  name: "emb_l2_norm"
  type: "Reduction"
  bottom: "480"
  top: "481"
  reduction_param {
    operation: SUMSQ
    axis: 1
  }
}

TensorRT throws:

Warning: The Reduce layer does not discard reduced dimensions. The reduced dimensions are treated as dimensions of size one in the output of the Reduce layer.
Warning: The Reduce layer does not discard reduced dimensions. The reduced dimensions are treated as dimensions of size one in the output of the Reduce layer.
I1017 17:57:03.894414  8845 pluginFactory.cpp:6] find gnap
Warning: The Reduce layer does not discard reduced dimensions. The reduced dimensions are treated as dimensions of size one in the output of the Reduce layer.
I1017 17:57:03.894479  8845 pluginFactory.cpp:6] find normalize
ERROR: Repeated layer name: reductionLayer/elementWiseLayer (layers must have distinct names)
F1017 17:57:03.894611  8845 baseEngine.cpp:67] Check failed: engine
*** Check failure stack trace: ***

This is definitely a bug, since I have no layer named reductionLayer/elementWiseLayer.

Any ideas?

Hello,

Looks like a bug in the parser.

A workaround for now is to use some other layer in place of the Reduce layer.

We cannot share more information about future releases here. Please watch our announcements for updates.

I have no idea where we can get “operation”. In other words, where can I add the piece of code you provided?

My apologies. The code snippet wasn’t intended to be part of the answer.

Hello,

A fix has been committed for a future TensorRT release. Please stay tuned for the announcement.
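
For anyone hitting this before the fix ships, here is a sketch of the "use some other layer" workaround mentioned above. Since SUMSQ(x) is just SUM(x²), the first SUMSQ layer in the prototxt above could be rewritten as a Caffe Power layer (element-wise square) followed by a plain SUM reduction. This is untested against this parser version, and the intermediate layer/blob names (square_474, 474_sq) are illustrative:

```
# Hypothetical replacement for the "reduction_sumsq" layer:
# square element-wise, then reduce with SUM instead of SUMSQ.
layer {
  name: "square_474"
  type: "Power"
  bottom: "474"
  top: "474_sq"
  power_param {
    power: 2   # computes (shift + scale * x)^power = x^2
    scale: 1
    shift: 0
  }
}
layer {
  name: "reduction_sumsq"
  type: "Reduction"
  bottom: "474_sq"
  top: "475"
  reduction_param {
    operation: SUM   # SUM over squared values == SUMSQ over the originals
    axis: 3
  }
}
```

The second SUMSQ layer (emb_l2_norm) can be rewritten the same way; the assumption is that the repeated-name collision is specific to how the parser expands SUMSQ internally, so plain SUM reductions avoid it.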