Can you help me understand why I'm getting this error if ReverseSequence is supported?
Error:
ModelImporter.cpp:135: No importer registered for op: ReverseSequence. Attempting to import as plugin.
[06/22/2020-17:20:59] [I] [TRT] builtin_op_importers.cpp:3556: Searching for plugin: ReverseSequence, plugin_version: 001, plugin_namespace:
[06/22/2020-17:20:59] [E] [TRT] INVALID_ARGUMENT: getPluginCreator could not find plugin ReverseSequence version 001
ERROR: builtin_op_importers.cpp:3558 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found"
Here's the ONNX file: https://drive.google.com/file/d/10KWPTNSbAGbzmkfrOCuRhc_VFtUcVpWs/view?usp=sharing
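For reference, here's a quick way to list which op types ended up in the exported graph (just a sketch; "model.onnx" is an assumed local path for the file linked above). The ReverseSequence nodes presumably come from the backward pass of the Bidirectional layers:

import onnx
from collections import Counter

m = onnx.load("model.onnx")  # assumed local copy of the linked ONNX file
# Count every op type in the graph; ReverseSequence should show up here
print(Counter(node.op_type for node in m.graph.node))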
Here's the model architecture:
from tensorflow.keras.layers import (Input, Conv2D, MaxPool2D, BatchNormalization,
                                     Lambda, Bidirectional, LSTM, Dense)
from tensorflow.keras.models import Model
from tensorflow.keras import backend as K

inputs = Input(shape=(32, 128, 1))
# convolution layer with kernel size (3,3)
conv_1 = Conv2D(64, (3, 3), activation='relu', padding='same')(inputs)
# pooling layer with kernel size (2,2)
pool_1 = MaxPool2D(pool_size=(2, 2), strides=2)(conv_1)
conv_2 = Conv2D(128, (3, 3), activation='relu', padding='same')(pool_1)
pool_2 = MaxPool2D(pool_size=(2, 2), strides=2)(conv_2)
conv_3 = Conv2D(256, (3, 3), activation='relu', padding='same')(pool_2)
conv_4 = Conv2D(256, (3, 3), activation='relu', padding='same')(conv_3)
# pooling layer with kernel size (2,1)
pool_4 = MaxPool2D(pool_size=(2, 1))(conv_4)
conv_5 = Conv2D(512, (3, 3), activation='relu', padding='same')(pool_4)
# batch normalization layer
batch_norm_5 = BatchNormalization()(conv_5)
conv_6 = Conv2D(512, (3, 3), activation='relu', padding='same')(batch_norm_5)
batch_norm_6 = BatchNormalization()(conv_6)
pool_6 = MaxPool2D(pool_size=(2, 1))(batch_norm_6)
conv_7 = Conv2D(512, (2, 2), activation='relu')(pool_6)
squeezed = Lambda(lambda x: K.squeeze(x, 1))(conv_7)
# bidirectional LSTM layers with units=128
blstm_1 = Bidirectional(LSTM(128, return_sequences=True, dropout=0.2))(squeezed)
blstm_2 = Bidirectional(LSTM(128, return_sequences=True, dropout=0.2))(blstm_1)
# char_list is the character set defined elsewhere in my code
outputs = Dense(len(char_list) + 1, activation='softmax')(blstm_2)
# model to be used at test time
model = Model(inputs, outputs)