Hi,
CUDA: 10.0
TensorRT: 5.0
CUDNN: 7.3
It seems that we cannot have two Reduction layers with the SUMSQ operation in the same network. For example, here are two of the Reduction layers in my prototxt:
layer {
  name: "reduction_sumsq"
  type: "Reduction"
  bottom: "474"
  top: "475"
  reduction_param {
    operation: SUMSQ
    axis: 3
  }
}
layer {
  name: "emb_l2_norm"
  type: "Reduction"
  bottom: "480"
  top: "481"
  reduction_param {
    operation: SUMSQ
    axis: 1
  }
}
TensorRT throws:
Warning: The Reduce layer does not discard reduced dimensions. The reduced dimensions are treated as dimensions of size one in the output of the Reduce layer.
Warning: The Reduce layer does not discard reduced dimensions. The reduced dimensions are treated as dimensions of size one in the output of the Reduce layer.
I1017 17:57:03.894414 8845 pluginFactory.cpp:6] find gnap
Warning: The Reduce layer does not discard reduced dimensions. The reduced dimensions are treated as dimensions of size one in the output of the Reduce layer.
I1017 17:57:03.894479 8845 pluginFactory.cpp:6] find normalize
ERROR: Repeated layer name: reductionLayer/elementWiseLayer (layers must have distinct names)
F1017 17:57:03.894611 8845 baseEngine.cpp:67] Check failed: engine
*** Check failure stack trace: ***
This looks like a bug: there is no layer named reductionLayer/elementWiseLayer anywhere in my prototxt, so the name must be generated internally by the parser. My guess is that the Caffe parser expands SUMSQ into an internal elementwise square followed by a reduce and gives the generated layers a fixed name, so the second SUMSQ layer collides with the first.
Any ideas?
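In the meantime, a possible workaround, assuming the name collision only happens for the internally generated SUMSQ expansion, is to make the square explicit with a Power layer (power: 2) and then reduce with SUM, which is mathematically equivalent to SUMSQ. A sketch for the second layer, reusing the blob names from my prototxt (the intermediate top "480_sq" is a name I made up):

```protobuf
layer {
  name: "emb_square"
  type: "Power"
  bottom: "480"
  top: "480_sq"
  power_param {
    power: 2  # elementwise square
  }
}
layer {
  name: "emb_l2_norm"
  type: "Reduction"
  bottom: "480_sq"
  top: "481"
  reduction_param {
    operation: SUM  # SUM over squared elements == SUMSQ
    axis: 1
  }
}
```

The first SUMSQ layer could be rewritten the same way; with no SUMSQ left in the network, the parser should not emit the duplicated internal layer name.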