Simple model which parses NCHW but not NHWC from UFF

I’m using TensorRT 5.1.5 on Windows with CUDA 9.0 and cuDNN 7.5, and TensorFlow 1.7.0.

To summarize: I have a model with a convolution and a concatenation (skip connection). It parses just fine when I build it (in TensorFlow) in NCHW, but when I build it in NHWC, the parser adds an extraneous transpose which breaks the concatenation. It does parse in NHWC order without the concatenation, and other skip connections, like an addition, seem to work fine.

More detail:

I have the following model
x0 is the input
x1 = 2d convolution of x0
x2 = concatenate x0 and x1 along the channels axis
x2 is the output
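In shape terms, the intended graph looks like the following (using NumPy arrays as stand-ins for the TensorFlow tensors; the 8-filter conv output is an assumed value, the post doesn’t state the filter count):

```python
import numpy as np

# NHWC shapes: (batch, height, width, channels)
x0 = np.zeros((1, 96, 96, 4), dtype=np.float32)  # input "input_1": 4 channels, 96x96
x1 = np.zeros((1, 96, 96, 8), dtype=np.float32)  # stand-in for the conv output (8 filters assumed)
x2 = np.concatenate([x0, x1], axis=3)            # output "out": concat along the channels axis
print(x2.shape)  # (1, 96, 96, 12)
```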

I created this model in TensorFlow and exported it as UFF.
If I build my model in NCHW data order, it parses fine. If I build it in NHWC,
the parser adds extra transpose layers (as expected), but it does so incorrectly and the model fails to parse. It does work fine in NHWC order if I don’t have the concatenate layer at the end.

The parser appears to build the network as follows:
x0 is the input in NHWC order
x0_shuffle = transpose x0 to NCHW (the parser adds this layer)
x1 = 2d convolution of x0_shuffle (this is fine)

but then before passing x0_shuffle to the concatenation layer, the parser transposes it again! (so it’s back in NHWC), and now the concatenation fails (the dimensions are mismatched)
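A NumPy walkthrough of the shapes in the parsed graph above (again assuming a hypothetical 8-filter convolution) shows why the second transpose breaks the concat:

```python
import numpy as np

x0 = np.zeros((1, 96, 96, 4), dtype=np.float32)  # NHWC input
x0_shuffle = np.transpose(x0, (0, 3, 1, 2))      # parser-inserted transpose -> NCHW (1, 4, 96, 96)
x1 = np.zeros((1, 8, 96, 96), dtype=np.float32)  # conv output in NCHW (8 filters assumed)

# Correct: concatenate x0_shuffle and x1 along axis 1 (the channel axis in NCHW)
ok = np.concatenate([x0_shuffle, x1], axis=1)    # (1, 12, 96, 96)

# What the parser appears to do: transpose x0_shuffle a second time
x0_bad = np.transpose(x0_shuffle, (0, 2, 3, 1))  # back to NHWC (1, 96, 96, 4)
try:
    np.concatenate([x0_bad, x1], axis=1)         # channel axes no longer line up
except ValueError as e:
    print("concat fails:", e)
```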

I will attach two UFFs. The input has 4 channels, and HW is 96x96. The input is called “input_1”, output is called “out”.

edit: reattaching the UFFs, I made a mistake in one of them

uff.rar (2.25 KB)

The model parses correctly if I apply an operation to x0 before passing it to the concatenation (e.g. multiply by 1). This seems like a bug; I can provide more details if anyone is interested.

I’ve run into the same problem, have you resolved it yet?

Unfortunately not. I only have a hacky workaround, which is to multiply by 1 immediately before passing the tensor to the concatenation, something like tf.concat([1*x0, x1], axis=3) (I was building my model in TensorFlow).

But personally I just switched to doing channels first