Using a TF-TRT model in TensorFlow

Hi,

I’ve followed the examples from this post: https://devblogs.nvidia.com/tensorrt-integration-speeds-tensorflow-inference/ and from this repo: https://github.com/tensorflow/models/tree/6ff0a53f81439d807a78f8ba828deaea3aaaf269/research/tensorrt

I was able to run the scripts, but I don’t know how to use the converted graphs afterwards.
When I try to display a converted graph with TensorBoard, I get this error:
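For reference, I’m invoking the import tool roughly like this (the paths are placeholders for my own; note that despite its name, `--model_dir` is passed the frozen .pb file itself, as the traceback below shows):

```shell
python3 -m tensorflow.python.tools.import_pb_to_tensorboard \
    --model_dir=/path/to/trt_graph.pb \
    --log_dir=/tmp/tensorboard_logs
```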

Traceback (most recent call last):
  File "/usr/lib/python3.5/runpy.py", line 184, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.5/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/tools/import_pb_to_tensorboard.py", line 76, in <module>
    app.run(main=main, argv=[sys.argv[0]] + unparsed)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/platform/app.py", line 125, in run
    _sys.exit(main(argv))
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/tools/import_pb_to_tensorboard.py", line 58, in main
    import_to_tensorboard(FLAGS.model_dir, FLAGS.log_dir)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/tools/import_pb_to_tensorboard.py", line 49, in import_to_tensorboard
    importer.import_graph_def(graph_def)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/util/deprecation.py", line 432, in new_func
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/framework/importer.py", line 418, in import_graph_def
    graph._c_graph, serialized, options)  # pylint: disable=protected-access
tensorflow.python.framework.errors_impl.NotFoundError: Op type not registered 'TRTEngineOp' in binary running on 425785a3e963. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.

When I try to load the graph in a notebook, I get a similar error.

It’s happening with the ResNet models provided with the code, and also with a custom MobileNet-based model.

I’m using TensorFlow 1.9 built from source (also tried TensorFlow 1.7 and 1.8), TensorRT 4 (also tried TensorRT 3), Python 3.5, and CUDA 9.0 with cuDNN 7.0, inside a Docker container.

Does anyone have an idea how to solve this? Is it OK to use a converted graph with TensorFlow tools, or can it only be used with TensorRT tools afterwards?

I also have another, unrelated question: can TensorRT be used with models that don’t have a fixed input shape? So far I have not managed to convert such a model.

Thank you in advance.

As the error states, TRTEngineOp is one of the lazily registered ops. You need to import tensorflow.contrib.tensorrt in the script that loads the graph. There is a pull request on GitHub to fix this, but it is still pending.
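For anyone landing here, a minimal sketch of what that looks like when loading the frozen graph yourself (assuming a TF 1.x build with TF-TRT support; the .pb path is a placeholder). The key point is that the contrib import must run before import_graph_def, since its side effect is registering TRTEngineOp:

```python
import tensorflow as tf
# Side-effect import: registers the TRTEngineOp kernel so that
# import_graph_def below can find it. Must happen before loading.
import tensorflow.contrib.tensorrt as trt  # noqa: F401

# Read the converted frozen graph (path is a placeholder).
graph_def = tf.GraphDef()
with tf.gfile.GFile("trt_graph.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

# Import it into a fresh graph; this is the step that fails with
# NotFoundError if the contrib import above is missing.
with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")
```

Note that import_pb_to_tensorboard is a separate process, so it won’t pick up this import; for TensorBoard you’d need the op registered in that tool’s binary as well, which is what the pending pull request addresses.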

I’m getting the exact same error. Importing tensorflow.contrib.tensorrt does not help. Is there a solution yet?