CRNN frozen model cannot be loaded

Hi, I want to use TensorRT to optimize my inference model, a CRNN architecture trained in Keras. I can generate the TensorFlow .pb file from the Keras model, but when I try to load the .pb file to build the TensorRT model, the following error occurs:

ValueError: Input 0 of node bi_rnn1_1/while/ReadVariableOp/Enter was passed float from bi_rnn1_1/forward_gru_1/kernel:0 incompatible with expected resource.

I have referred to this topic:
https://devtalk.nvidia.com/default/topic/1050006/incompatible-with-expected-resource/

but it did not work.

NVIDIA Xavier
TensorFlow 1.12.0
Keras 2.2.4
CUDA 10.0
cuDNN 7.5.4

I suspect the problem is in the RNN layers, because I can generate a TensorRT engine from a pure CNN model without any issue.
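For reference, the graph-freezing step follows this general pattern (a minimal sketch, not my exact code; the function name is illustrative, and `tf.compat.v1` stands in for plain `import tensorflow as tf` on TF 1.12):

```python
import tensorflow.compat.v1 as tf1  # on TF 1.x, use `import tensorflow as tf` directly

tf1.disable_eager_execution()  # no-op on TF 1.x


def freeze_graph(sess, output_node_names):
    """Convert every variable in the session's graph to a constant.

    If variables are left unfrozen, the exported GraphDef keeps
    resource ops such as ReadVariableOp/Enter, which later fail with
    "passed float ... incompatible with expected resource".
    """
    with sess.graph.as_default():
        return tf1.graph_util.convert_variables_to_constants(
            sess, sess.graph.as_graph_def(), output_node_names)
```

With Keras 2.2.4 the session would come from `keras.backend.get_session()` and the output names from `[out.op.name for out in model.outputs]`.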

Do you have any idea how to solve this issue?

The detailed code is attached below.

Thank you



Hi,

This looks similar to the issue below:
https://github.com/onnx/tensorflow-onnx/issues/77

It may also occur if a different TF version is used.
We recommend loading the .pb file with the same TF version that generated it.
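One quick check (a sketch; the path `frozen.pb` is illustrative, and `tf.compat.v1` stands in for plain `import tensorflow as tf` on TF 1.x) is to scan the GraphDef for leftover resource ops — if `ReadVariableOp` nodes are still present, the variables were never frozen into constants:

```python
import tensorflow.compat.v1 as tf1  # on TF 1.x, use `import tensorflow as tf` directly


def leftover_resource_ops(pb_path):
    """Return the names of resource-variable ops still in a frozen graph."""
    graph_def = tf1.GraphDef()
    with open(pb_path, "rb") as f:
        graph_def.ParseFromString(f.read())
    # A properly frozen graph stores weights as Const/Identity nodes;
    # any remaining ReadVariableOp/VarHandleOp means freezing was incomplete.
    return [n.name for n in graph_def.node
            if n.op in ("ReadVariableOp", "VarHandleOp")]
```

For example, `print(leftover_resource_ops("frozen.pb"))` should print an empty list for a correctly frozen model.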

Thanks