I have been trying to accelerate the TensorFlow attention_ocr model with TensorRT recently. It has several unsupported layers, so I wrote plugins for them and registered them in the TensorRT namespace. The other layers work fine, but the split plugin causes an error:
TRT Error Repeated tensor name: AttentionOcr_v1/sequence_logit_fn/SQLR/LSTM/attention_decoder/lstm_cell/split_1
But when I check the .pb and .uff files, there is no duplicated node named split_1.
Could you help me figure out what the problem is and how to solve it? Thanks.
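For context, here is a rough sketch of the usual graphsurgeon/UFF route for mapping an unsupported op to a plugin node; the paths, plugin op name, and parameters below are illustrative assumptions, not the actual conversion script:

```python
# Sketch only: map unsupported TF ops to TensorRT plugin nodes before UFF
# conversion. Paths, node names, and the plugin op name are placeholders.
import graphsurgeon as gs
import uff

graph = gs.DynamicGraph("frozen_attention_ocr.pb")  # hypothetical path

# Give every plugin node a unique name; reusing one name across the splits
# is one way to end up with "Repeated tensor name" at parse time.
namespace_map = {}
for i, node in enumerate(graph.find_nodes_by_op("Split")):
    plugin = gs.create_plugin_node(
        name="split_plugin_%d" % i,   # unique per instance
        op="SplitPlugin_TRT",         # must match the registered plugin creator
        axis=1,                       # illustrative parameter
    )
    namespace_map[node.name] = plugin

graph.collapse_namespaces(namespace_map)

uff.from_tensorflow(
    graph.as_graph_def(),
    output_nodes=["AttentionOcr_v1/predicted_chars"],  # hypothetical output node
    output_filename="attention_ocr.uff",
)
```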
Could you please provide details on the platforms you are using:
o Linux distro and version
o GPU type
o Nvidia driver version
o CUDA version
o CUDNN version
o Python version [if using python]
o Tensorflow and PyTorch version
o TensorRT version
If possible, please share the script & model file along with the error log to reproduce the issue.
I broke the LSTM up into several parts and wrote plugins for the unsupported ops such as clip_by_value and split. In the basic LSTM cell there are 10 splits. When running the first split in TensorRT there is no problem. However, when running the second split, this error comes out:
TRT Error Repeated tensor name: AttentionOcr_v1/sequence_logit_fn/SQLR/LSTM/attention_decoder/lstm_cell/split_1
I am not able to access the zip file. Could you please upload it in the forum itself?
Meanwhile, could you please check whether all tensors still have distinct names after the 10 splits? This error normally occurs when the same tensor name is repeated in the model.
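As a quick way to check, something like the following (a rough sketch; the .pb path is a placeholder) lists any node names that occur more than once in the frozen graph:

```python
# Sketch: count node names in the frozen graph and print any duplicates.
from collections import Counter
import tensorflow as tf

graph_def = tf.compat.v1.GraphDef()
with tf.io.gfile.GFile("frozen_attention_ocr.pb", "rb") as f:  # placeholder path
    graph_def.ParseFromString(f.read())

counts = Counter(node.name for node in graph_def.node)
duplicates = {name: n for name, n in counts.items() if n > 1}
print(duplicates or "all node names are unique")
```

If the .pb is clean, the duplicate may instead be introduced during the UFF conversion or plugin mapping step rather than in the original graph.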
Hi, could you please provide a TensorFlow attention_ocr model to ONNX conversion script? I am facing conversion issues with unsupported layers. I want to run inference on a Jetson Nano using TensorRT. Could you also provide the plugin file and conversion scripts? It would be helpful.
Thank you
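For what it's worth, a minimal sketch of the common tf2onnx route for a frozen graph, assuming a recent tf2onnx release with the from_graph_def API; the paths, tensor names, and opset below are placeholders, not values from the attention_ocr model:

```python
# Sketch: convert a frozen TensorFlow graph to ONNX with tf2onnx.
# Input/output tensor names, the path, and the opset are placeholders.
import tensorflow as tf
from tf2onnx import convert

graph_def = tf.compat.v1.GraphDef()
with tf.io.gfile.GFile("frozen_attention_ocr.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

model_proto, _ = convert.from_graph_def(
    graph_def,
    input_names=["images:0"],                              # placeholder input tensor
    output_names=["AttentionOcr_v1/predicted_chars:0"],    # placeholder output tensor
    opset=11,
    output_path="attention_ocr.onnx",
)
```

Ops that tf2onnx cannot map would still need custom handling (custom ops on the ONNX side or TensorRT plugins), which this sketch does not cover.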