TensorRT Loading ONNX Model Error with LSTM/recursion/loop


I have a saved model which I converted to ONNX using tf2onnx, as follows:

python3 -m tf2onnx.convert --fold_const --saved-model saved_model/ --output model.onnx

The model contains unidirectional LSTM layers. I am then importing this model into TensorRT. The parsing of the model completes properly, but I then get the following error when TensorRT tries to validate the network:

(Unnamed Layer* 155) [Recurrence]: recurrence inputs must have same type
Loop API is not supported on the requested platform.
Network validation failed.

Any advice would be greatly appreciated.
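For context, the first error says that a loop-carried (recurrent) value changes data type between iterations. This is a hypothetical numpy sketch of the kind of mismatch the message refers to, not the actual TensorRT check:

```python
import numpy as np

# Hypothetical illustration of "recurrence inputs must have same type":
# the loop-carried hidden state must keep one dtype across iterations.
h = np.zeros(4, dtype=np.float32)   # initial hidden state: float32
W = np.eye(4, dtype=np.float64)     # weights accidentally float64

# One recurrence step: numpy promotes the result to float64,
# so the "next" state no longer matches the initial state's dtype.
h_next = W @ h
print(h.dtype, h_next.dtype)        # float32 float64
```

In an ONNX Loop/Scan body the analogous situation would be an initial state tensor and the body's output tensor disagreeing in element type.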


TensorRT 7
CUDA 10.2
Ubuntu 18.04 64-bit


After further investigation I also noticed the following warnings being printed by the onnx2trt converter.

onnx2trt_utils.cpp:283: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
onnx2trt_utils.cpp:309: One or more weights outside the range of INT32 was clamped

It seems to cast them down to INT32, which TensorRT does support, but this warning may be related to the bug.
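As an aside, the clamping the second warning describes can be sketched in numpy; this is an assumption about what the converter does, mirroring the wording of the message rather than its actual implementation:

```python
import numpy as np

# Sketch (assumption): cast INT64 weights down to INT32, clamping any
# values that fall outside the representable INT32 range.
i32 = np.iinfo(np.int32)
w64 = np.array([7, 2**40, -(2**35)], dtype=np.int64)

clamped = np.clip(w64, i32.min, i32.max).astype(np.int32)
print(clamped)  # in-range values survive; out-of-range ones hit the limits
```

If the LSTM's loop weights (e.g. sequence lengths or indices) are among the clamped tensors, that could plausibly interact with the loop validation failure.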
