How can I add an LSTM layer (and set max_seq_length and the reverse-direction weights)?

I use PyTorch to build my LSTM network; it looks like this:

import torch
import torch.nn as nn

class TestNet(torch.nn.Module):
    def __init__(self):
        super(TestNet, self).__init__()
        # 2-layer bidirectional LSTM, hidden size 128 per direction
        self.lstm = nn.LSTM(256, 256 // 2, 2,
                            batch_first=True, bidirectional=True)

    def forward(self, x):
        self.lstm.flatten_parameters()
        res = self.lstm(x)
        return res

How can I set max_seq_length when calling network.add_rnn_v2?

When I set max_seq_length = input_tensor.shape[0], I get this error:

[TensorRT] ERROR: Parameter check failed at: ../builder/Network.cpp::addRNNCommon::397, condition: input.getDimensions().d[di.seqLen()] == maxSeqLen
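The parameter check is saying that the input tensor's dimension at the sequence-length axis must equal the max_seq_length passed to add_rnn_v2. Note that with batch_first=True in PyTorch the tensor layout is (batch, seq, feature), so shape[0] is the batch size, not the sequence length. A minimal sketch, assuming the TensorRT Python API with an implicit batch dimension (the helper names add_lstm and seq_len_from_batch_first are mine, not part of either library):

```python
try:
    import tensorrt as trt  # requires a TensorRT installation
except ImportError:
    trt = None  # sketch only; the helpers below that use trt won't run without it

def seq_len_from_batch_first(shape):
    """For a batch_first PyTorch tensor of shape (batch, seq, feature),
    the sequence length is shape[1], not shape[0]."""
    return shape[1]

def add_lstm(network, input_tensor, hidden_size=128, num_layers=2):
    # With an implicit batch dimension the network input dims are
    # (max_seq_length, input_size): the axis TensorRT treats as the
    # sequence length must equal the max_seq_length argument exactly.
    max_seq_length = input_tensor.shape[0]
    rnn = network.add_rnn_v2(input_tensor, num_layers, hidden_size,
                             max_seq_length, trt.RNNOperation.LSTM)
    rnn.direction = trt.RNNDirection.BIDIRECTION
    return rnn
```

So the first thing to check is which axis of your input the builder sees as the sequence axis; if you exported a batch_first tensor, the sequence length lives at index 1.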

I can set the per-gate weights by calling

layer.set_weights_for_gate

but how can I set the reverse-direction (backward) weights?
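For reference: in a bidirectional IRNNv2Layer, weights are addressed by a pseudo-layer index — the forward pass of layer l is index 2*l and the backward (reverse) pass is 2*l + 1 — so PyTorch's weight_ih_l{k}_reverse parameters go to the odd indices. A sketch under that assumption (PyTorch concatenates the four LSTM gates in the order i, f, g, o; the helper names pseudo_layer_index and set_lstm_weights are mine):

```python
try:
    import tensorrt as trt  # requires a TensorRT installation
except ImportError:
    trt = None  # sketch only; set_lstm_weights won't run without it

def pseudo_layer_index(layer, backward):
    """IRNNv2Layer pseudo-layer index in a bidirectional RNN:
    the forward pass of layer l is 2*l, the backward pass is 2*l + 1."""
    return 2 * layer + (1 if backward else 0)

def set_lstm_weights(rnn, lstm, num_layers=2):
    # PyTorch packs the four LSTM gates as (input, forget, cell, output).
    gates = [trt.RNNGateType.INPUT, trt.RNNGateType.FORGET,
             trt.RNNGateType.CELL, trt.RNNGateType.OUTPUT]
    sd = lstm.state_dict()
    for l in range(num_layers):
        for backward in (False, True):
            suffix = "_reverse" if backward else ""
            idx = pseudo_layer_index(l, backward)
            # split the stacked gate weights into four (hidden, *) chunks
            w_ih = sd[f"weight_ih_l{l}{suffix}"].chunk(4, 0)
            w_hh = sd[f"weight_hh_l{l}{suffix}"].chunk(4, 0)
            for gate, wi, wh in zip(gates, w_ih, w_hh):
                # is_w=True -> input weights, is_w=False -> recurrent weights
                rnn.set_weights_for_gate(idx, gate, True,
                                         wi.contiguous().numpy())
                rnn.set_weights_for_gate(idx, gate, False,
                                         wh.contiguous().numpy())
```

The same pseudo-layer indexing applies to set_bias_for_gate for the bias_ih / bias_hh tensors.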

Could you please let us know if you are still facing this issue?

Thanks