Why doesn't TensorRT support transpose and RNN?

Why doesn't TensorRT support transpose and RNN? If there is no plan to support these ops in the short term, then I plan to implement a custom CUDA layer for TensorRT.

Sad to hear that. I ran into this error:
[Transpose]:
ERROR: builtin_op_importers.cpp:1928 In function importTranspose:
[8] Assertion failed: perm.order[BATCH_DIM] == BATCH_DIM

Does this mean I have to give up TensorRT?
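Not necessarily. The assertion `perm.order[BATCH_DIM] == BATCH_DIM` fires because the implicit-batch ONNX parser treats axis 0 as the batch dimension and rejects any Transpose whose permutation moves it; a Transpose that only permutes the non-batch axes imports fine. A minimal sketch of that check (assuming `BATCH_DIM` is 0, as in implicit-batch mode):

```python
BATCH_DIM = 0  # assumption: implicit-batch parsers treat axis 0 as the batch axis

def is_importable_transpose(perm):
    """Return True if the permutation leaves the batch axis in place,
    which is what the importer's assertion requires."""
    return perm[BATCH_DIM] == BATCH_DIM

print(is_importable_transpose([0, 2, 1]))  # True: batch axis untouched
print(is_importable_transpose([1, 0, 2]))  # False: moves the batch axis, triggers the assert
```

So one way forward is to re-export the model so no Transpose touches axis 0 (or to use an explicit-batch workflow in newer TensorRT versions, where this restriction does not apply).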

My question is related to this. I am trying to convert an .onnx file that I created from a Chainer model to a TRT engine, but I get the error below.

onnx2trt …/embedding/mobinet.onnx -o …/embedding/mobinet.trt -b 10


Input filename: …/embedding/mobinet.onnx
ONNX IR version: 0.0.4
Opset version: 9
Producer name: Chainer
Producer version: 6.1.0
Domain:
Model version: 0
Doc string:

WARNING: ONNX model has a newer ir_version (0.0.4) than this parser was built against (0.0.3).
Parsing model
While parsing node number 1 [Transpose -> “Transpose_0”]:
ERROR: /home/onnx-tensorrt/builtin_op_importers.cpp:1942 In function importTranspose:
[8] Assertion failed: transposeWeights(weights, perm, &new_weights)

So my question is: the TensorRT support matrix at https://docs.nvidia.com/deeplearning/sdk/tensorrt-support-matrix/index.html lists the ONNX Transpose operation as supported, so why am I getting this error?
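The support matrix says Transpose is supported, but only transposes that leave the batch axis (axis 0) in place pass the implicit-batch importer; it can help to scan the model for the offending nodes before deciding anything. A sketch of that scan (the node dicts below are hypothetical stand-ins for what `onnx.load(...).graph.node` would give you; the `Transpose_0` name is taken from the log above):

```python
# Hypothetical stand-ins for ONNX graph nodes; real ones are protobuf
# messages where 'perm' is a node attribute rather than a dict key.
nodes = [
    {"op_type": "Transpose", "name": "Transpose_0", "perm": [1, 0, 2]},
    {"op_type": "Transpose", "name": "Transpose_1", "perm": [0, 2, 1]},
    {"op_type": "Relu",      "name": "Relu_0"},
]

def offending_transposes(nodes, batch_dim=0):
    """Return names of Transpose nodes whose permutation moves the
    batch axis -- the ones the implicit-batch importer rejects."""
    return [n["name"] for n in nodes
            if n["op_type"] == "Transpose" and n["perm"][batch_dim] != batch_dim]

print(offending_transposes(nodes))  # ['Transpose_0']
```

If the scan flags nodes, the usual options are re-exporting the model so those transposes avoid axis 0, or moving to an explicit-batch TensorRT workflow.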

I'm also hitting this error. Have you solved it yet?