ONNX Transpose giving error while converting to TensorRT engine

My question is related to this:
I am trying to convert an .onnx file, exported from a Chainer model, to a TensorRT engine, but I get the error below.

onnx2trt …/embedding/mobinet.onnx -o …/embedding/mobinet.trt -b 10

Input filename: …/embedding/mobinet.onnx
ONNX IR version: 0.0.4
Opset version: 9
Producer name: Chainer
Producer version: 6.1.0
Model version: 0
Doc string:

WARNING: ONNX model has a newer ir_version (0.0.4) than this parser was built against (0.0.3).
Parsing model
While parsing node number 1 [Transpose -> "Transpose_0"]:
ERROR: /home/onnx-tensorrt/builtin_op_importers.cpp:1942 In function importTranspose:
[8] Assertion failed: transposeWeights(weights, perm, &new_weights)

So my question is: the TensorRT support matrix at https://docs.nvidia.com/deeplearning/sdk/tensorrt-support-matrix/index.html lists the ONNX Transpose operation as supported, so why does parsing this model give me the error above?