Reverse Sequence operation not supported


LSTM is supported in TensorRT, but the ReverseSequence operation of LSTM is not. Why is that?

If the LSTM layer is supported, then its operations should also be supported. Am I right?

TensorRT Version: 7
OS: Ubuntu 18.04

Hello @aaryan, the Reverse Sequence operation is supported in LSTM.
Could you please share the model and script that are failing for you?

Can you help me with this? Why am I getting this error if Reverse Sequence is supported?

```
ModelImporter.cpp:135: No importer registered for op: ReverseSequence. Attempting to import as plugin.
[06/22/2020-17:20:59] [I] [TRT] builtin_op_importers.cpp:3556: Searching for plugin: ReverseSequence, plugin_version: 001, plugin_namespace:
[06/22/2020-17:20:59] [E] [TRT] INVALID_ARGUMENT: getPluginCreator could not find plugin ReverseSequence version 001
ERROR: builtin_op_importers.cpp:3558 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found"
```
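For context, the ReverseSequence op that the parser is rejecting reverses each batch item's sequence along the time axis, up to that item's own length. A minimal NumPy sketch of its semantics (the function name here is illustrative, not a TensorRT or ONNX API):

```python
import numpy as np

def reverse_sequence(x, seq_lengths):
    """Emulate ONNX ReverseSequence (batch_axis=0, time_axis=1):
    for each batch item i, reverse the first seq_lengths[i] time
    steps; any steps past the sequence length are left untouched."""
    out = x.copy()
    for i, n in enumerate(seq_lengths):
        out[i, :n] = x[i, :n][::-1]
    return out

x = np.arange(12).reshape(2, 3, 2)   # [batch=2, time=3, features=2]
y = reverse_sequence(x, [3, 2])      # item 0 fully reversed, item 1 only first 2 steps
```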

Here’s the ONNX file:

Here’s the model architecture:

```python
inputs = Input(shape=(32, 128, 1))

# convolution layer with kernel size (3,3)
conv_1 = Conv2D(64, (3, 3), activation='relu', padding='same')(inputs)

# pooling layer with kernel size (2,2)
pool_1 = MaxPool2D(pool_size=(2, 2), strides=2)(conv_1)

conv_2 = Conv2D(128, (3, 3), activation='relu', padding='same')(pool_1)
pool_2 = MaxPool2D(pool_size=(2, 2), strides=2)(conv_2)

conv_3 = Conv2D(256, (3, 3), activation='relu', padding='same')(pool_2)
conv_4 = Conv2D(256, (3, 3), activation='relu', padding='same')(conv_3)

# pooling layer with kernel size (2,1)
pool_4 = MaxPool2D(pool_size=(2, 1))(conv_4)

conv_5 = Conv2D(512, (3, 3), activation='relu', padding='same')(pool_4)

# batch normalization layer
batch_norm_5 = BatchNormalization()(conv_5)

conv_6 = Conv2D(512, (3, 3), activation='relu', padding='same')(batch_norm_5)
batch_norm_6 = BatchNormalization()(conv_6)
pool_6 = MaxPool2D(pool_size=(2, 1))(batch_norm_6)

conv_7 = Conv2D(512, (2, 2), activation='relu')(pool_6)

squeezed = Lambda(lambda x: K.squeeze(x, 1))(conv_7)

# bidirectional LSTM layers with units=128
blstm_1 = Bidirectional(LSTM(128, return_sequences=True, dropout=0.2))(squeezed)
blstm_2 = Bidirectional(LSTM(128, return_sequences=True, dropout=0.2))(blstm_1)

outputs = Dense(len(char_list) + 1, activation='softmax')(blstm_2)

# model to be used at test time
model = Model(inputs, outputs)
```
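A possible workaround to consider (a sketch under an assumption, not a confirmed fix): in this model the LSTM inputs come from fixed-size image feature columns, so every sequence in the batch has the same length. In that case the per-item ReverseSequence reduces to a plain flip of the whole time axis, which an exporter can express as a strided Slice instead of ReverseSequence. In NumPy terms:

```python
import numpy as np

# Assumption: all sequences in the batch are full length (true here,
# since the time dimension comes from a fixed-size image), so reversing
# each sequence is just a flip of the entire time axis.
def flip_time_axis(x):
    # Equivalent to ReverseSequence with seq_lengths == x.shape[1] for
    # every batch item; a flip is its own inverse.
    return x[:, ::-1, :]

x = np.arange(12).reshape(2, 3, 2)   # [batch, time, features]
y = flip_time_axis(x)
```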

Hi,
the ReverseSequence operation is not supported in TensorRT.
However, the team is working to fix this.
Apologies for the confusion.


I am also facing the same problem. Have you solved it? If possible, would you mind sharing your solution? @aaryan