TF-TRT Integration: 'Unknown input node' reported while using create_inference_graph

Hi all,
I'm trying to convert a TensorFlow graph to a TensorRT inference graph using the Python API 'create_inference_graph' in 'tensorflow.contrib.tensorrt', and these warnings came up:

W tensorflow/contrib/tensorrt/convert/convert_graph.cc:794] Failed to register segment graphdef as a function 1: Invalid argument: Node 'polymath1_model_Polymath1_Model_1_0_2/modeling_layer/layer1/bw/bw/while/dropout/random_uniform/sub': Unknown input node '^polymath1_model_Polymath1_Model_1_0_2/modeling_layer/layer1/bw/bw/while/Identity'
W tensorflow/contrib/tensorrt/convert/convert_graph.cc:794] Failed to register segment graphdef as a function 7: Invalid argument: Node 'polymath1_model_Polymath1_Model_1_0_2/embedding_layer_5/bidirectional_rnn/fw/fw/while/dropout/random_uniform/sub': Unknown input node '^polymath1_model_Polymath1_Model_1_0_2/embedding_layer_5/bidirectional_rnn/fw/fw/while/Identity'
W tensorflow/contrib/tensorrt/convert/convert_graph.cc:794] Failed to register segment graphdef as a function 8: Invalid argument: Node 'polymath1_model_Polymath1_Model_1_0_2/embedding_layer_4/bidirectional_rnn/fw/fw/while/dropout/random_uniform/sub': Unknown input node '^polymath1_model_Polymath1_Model_1_0_2/embedding_layer_4/bidirectional_rnn/fw/fw/while/Identity'

But these input node names (without the leading "^", which just marks a control dependency) do exist in the GraphDef.
Then, after the warnings, this error appeared:

E tensorflow/contrib/tensorrt/log/trt_logger.cc:38] DefaultLogger Parameter check failed at: Network.cpp::addInput::281, condition: isIndexedCHW(dims) && volume(dims) < MAX_TENSOR_SIZE

I'm not sure whether the warnings directly caused the error, or whether these are two separate problems to solve.
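For reference, the conversion call is roughly the following sketch. The output node name and the sizing parameters are placeholders, not the exact values from my script, and this requires a TF 1.x + TensorRT installation:

```python
import tensorflow.contrib.tensorrt as trt

# frozen_graph is the GraphDef loaded from the frozen .pb binary;
# 'output_node' stands in for the model's real output name.
trt_graph = trt.create_inference_graph(
    input_graph_def=frozen_graph,
    outputs=['output_node'],            # placeholder output name
    max_batch_size=1,                   # placeholder batch size
    max_workspace_size_bytes=1 << 25,   # placeholder workspace size
    precision_mode='FP32')
```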

I tried two approaches to generate the protobuf (.pb) binary: one uses the 'freeze_graph' tool, and the other calls 'convert_variables_to_constants' from 'tf.graph_util' right after the trained graph is ready. Both binaries led to the same warnings and error.
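The second approach is roughly this sketch (the session setup and output node name are placeholders; this is TF 1.x API):

```python
import tensorflow as tf

# sess is the session holding the trained graph;
# 'output_node' stands in for the model's real output name.
frozen_graph_def = tf.graph_util.convert_variables_to_constants(
    sess, sess.graph.as_graph_def(), ['output_node'])

# Serialize the frozen GraphDef to a .pb binary.
with tf.gfile.GFile('model.pb', 'wb') as f:
    f.write(frozen_graph_def.SerializeToString())
```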

Versions of crucial libraries:
tensorflow-gpu (1.10.0)
tensorrt (3.0.4)
(The sample at https://devblogs.nvidia.com/tensorrt-integration-speeds-tensorflow-inference/ works for me.)

Thanks in advance!

Hello,

Unfortunately, TensorRT 3 doesn't support TF 1.10. Please update to TensorRT 4 and use TF 1.9.