TensorRT does not support permute in N (batch) dimension

In my project, I use a permute layer in a Caffe prototxt. I wanted to permute the shape N×C×H×W (batch × channel × height × width) to W×N×C×H, but when I converted this Caffe model to a TensorRT model, the nvcaffeparser1::ICaffeParser::parse() function threw the following error:
TensorRT does not support permute in N (batch) dimension, and order index must be within the tensor dimensions.

Could you tell me how to solve this problem? Should I implement a new layer in TensorRT using the IPlugin interface, or is there any other solution in TensorRT?


This is not possible in TensorRT. N must be the outermost dimension.

TensorRT assumes batching really is batching: i.e. it executes the same pipeline independently for each element of the batch, so the batch dimension cannot participate in a permutation.
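To illustrate the restriction, here is a small numpy sketch (not TensorRT code) of the difference between the two kinds of permute. A permutation that leaves N as the outermost axis is equivalent to transforming each batch element independently, which is what TensorRT can express; moving N inward mixes data across batch elements.

```python
import numpy as np

# Illustrative shapes only.
N, C, H, W = 2, 3, 4, 5
x = np.arange(N * C * H * W).reshape(N, C, H, W)

# Expressible in TensorRT: N stays outermost, only C, H, W are
# permuted (here CHW -> WCH). This equals applying the same
# per-element transpose to every batch element.
per_element = np.transpose(x, (0, 3, 1, 2))
same = np.stack([np.transpose(x[i], (2, 0, 1)) for i in range(N)])
assert np.array_equal(per_element, same)

# Not expressible: NCHW -> WNCH moves N out of the outermost
# position, so no single per-element pipeline can produce it.
cross_batch = np.transpose(x, (3, 0, 1, 2))
print(per_element.shape, cross_batch.shape)  # (2, 5, 3, 4) (5, 2, 3, 4)
```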

Alternatively, you can split the network at this point into two TRT networks, with custom code in between to perform the batch-dimension permute.
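A minimal sketch of that split, with numpy as a stand-in for real engine execution (the `run_first_engine` / `run_second_engine` helpers are hypothetical placeholders, not TensorRT API calls): run the first engine, transpose on the host, then feed the second engine.

```python
import numpy as np

def run_first_engine(x):
    # Placeholder for executing the first TRT engine
    # (in practice: IExecutionContext on a CUDA stream).
    return x

def run_second_engine(x):
    # Placeholder for executing the second TRT engine.
    return x

x = np.zeros((2, 3, 4, 5), dtype=np.float32)  # NCHW input
y = run_first_engine(x)

# Custom code between the two engines: NCHW -> WNCH, the permute
# TensorRT itself cannot express. ascontiguousarray makes the
# buffer dense again before it is handed to the next engine.
y = np.ascontiguousarray(np.transpose(y, (3, 0, 1, 2)))

out = run_second_engine(y)
print(out.shape)  # (5, 2, 3, 4)
```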

Thank you!
Two more questions:

  1. Does TRT support axis: 2 in the FullyConnected layer, as Caffe's InnerProduct layer does? If not, are there any plans to add it in the future?
  2. Could I reshape N×C×H×W to (N×C)×H×W×1 with a Reshape layer in TensorRT?