My TensorFlow model has a Depthwise_conv2d layer.
In TensorFlow, the ResizeArea output that feeds into Depthwise_conv2d has shape 1x240x320x19 (NHWC).
So in TensorFlow, Depthwise_conv2d operates on the 1x240x320x19 input with a 25x25x19x1 kernel.
When the model is converted to TensorRT, ResizeArea is not supported, so I need a plugin, and the plugin format is NCHW only. So in TensorRT, after the ResizeArea plugin, the tensor shape is 1x19x240x320. Then the TensorRT parser fails while building the engine from the network.
The error is:

[TensorRT] ERROR: gaussian_heatMat/depthwise: kernel weights has count 11875 but 7796358 was expected
[TensorRT] ERROR: gaussian_heatMat/depthwise: count of 11875 weights in kernel, but kernel dimensions (25, 25) with 1459200 input channels, 19 output channels and 19 groups were specified.
[TensorRT] ERROR: UffParser: Parser error: MarkOutput_0: Order size is not matching the number dimensions of TensorRT
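The numbers in the error suggest the parser is misreading the dimension order: a 25x25x19x1 depthwise kernel has 11875 weights, while 240x320x19 = 1459200, which matches the "1459200 input channels" in the message, i.e. the parser seems to be treating H*W*C of my NHWC tensor as the channel count. A quick arithmetic check (shapes taken from above):

```python
# Shapes from the model: depthwise kernel 25x25x19x1, tensor 1x240x320x19 (NHWC)
kh, kw, c, depth_mult = 25, 25, 19, 1
n, h, w = 1, 240, 320

# Actual depthwise kernel weight count
weights = kh * kw * c * depth_mult
print(weights)  # 11875, matching "kernel weights has count 11875"

# If the parser mistakes H*W*C for the channel count (NHWC read as NCHW):
bogus_channels = h * w * c
print(bogus_channels)  # 1459200, matching "1459200 input channels"
```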
How can I solve this problem?
I can't add a transpose in TensorFlow before Depthwise_conv2d, because the plugin output would then be transposed again and the dimension mismatch would reappear.
Or, how can I create a Depthwise_conv2d layer directly with the TensorRT API? Could I have a sample?
Then I could build two TensorRT engines, so that there is no more NCHW/NHWC issue.
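For context on my second question: as I understand it, a depthwise convolution is just a grouped convolution with the group count equal to the number of input channels (which seems to be how TensorRT's IConvolutionLayer would express it via its group-count setting, rather than needing a plugin). A minimal NumPy sketch of that equivalence on NCHW data, with toy sizes and hypothetical helper names, no TensorRT involved:

```python
import numpy as np

def depthwise_conv_nchw(x, k):
    """Depthwise conv, 'valid' padding. x: (N, C, H, W), k: (C, kH, kW)."""
    n, c, h, w = x.shape
    _, kh, kw = k.shape
    out = np.zeros((n, c, h - kh + 1, w - kw + 1))
    for i in range(h - kh + 1):
        for j in range(w - kw + 1):
            patch = x[:, :, i:i + kh, j:j + kw]        # (N, C, kH, kW)
            out[:, :, i, j] = (patch * k).sum(axis=(2, 3))
    return out

def grouped_conv_nchw(x, k, groups):
    """Grouped conv. x: (N, C, H, W), k: (C_out, C_in/groups, kH, kW)."""
    n, c, h, w = x.shape
    cout, cin_per_g, kh, kw = k.shape
    out = np.zeros((n, cout, h - kh + 1, w - kw + 1))
    out_per_g = cout // groups
    for g in range(groups):
        xg = x[:, g * cin_per_g:(g + 1) * cin_per_g]   # this group's input slice
        for o in range(out_per_g):
            oc = g * out_per_g + o
            for i in range(h - kh + 1):
                for j in range(w - kw + 1):
                    patch = xg[:, :, i:i + kh, j:j + kw]
                    out[:, oc, i, j] = (patch * k[oc]).sum(axis=(1, 2, 3))
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((1, 3, 8, 8))   # toy NCHW input
k = rng.standard_normal((3, 5, 5))      # one 5x5 filter per channel

dw = depthwise_conv_nchw(x, k)
gc = grouped_conv_nchw(x, k[:, None], groups=3)  # reshape kernel to (C, 1, kH, kW)
print(np.allclose(dw, gc))  # True: depthwise == grouped conv with groups = C
```

If that equivalence is right, then a regular TensorRT convolution layer with the group count set to the channel count (19 in my case) should reproduce the TensorFlow Depthwise_conv2d. I'd appreciate confirmation or a sample.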