TF-TRT

From this page on TF-TRT:
https://docs.nvidia.com/deeplearning/dgx/integrate-tf-trt/index.html#introduction

A TensorFlow graph can be converted with TensorRT using “trt.create_inference_graph”, and the resulting trt_graph can then be loaded back as a TensorFlow graph to run inference in a TensorFlow inference pipeline.

All the examples I see on that page use TensorFlow Python inference. My question is: can we use TensorFlow C++ inference with TF-TRT? If we can, is there any sample I can refer to?

Hello,

Using the TF-TRT API from C++ may require building TensorFlow from source with Bazel.

something like this:

  1. Add //tensorflow/contrib/tensorrt:trt_engine_op_kernel as a dependency to your Bazel project.

  2. Load the op library _trt_engine_op.so in your code using the C API (see the sketch below).
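
Not an official sample, but roughly what that could look like in C++, assuming the binary is built with the trt_engine_op_kernel dependency from step 1. The graph file name, input/output node names, and input shape below are placeholders, not values from the docs:

  #include <cstdio>
  #include <memory>
  #include <vector>

  #include "tensorflow/c/c_api.h"                     // TF_LoadLibrary
  #include "tensorflow/core/framework/graph.pb.h"
  #include "tensorflow/core/platform/env.h"
  #include "tensorflow/core/public/session.h"

  int main() {
    // Step 2: load the op library so the TRTEngineOp nodes in the
    // converted graph can be resolved at session creation time.
    TF_Status* status = TF_NewStatus();
    TF_Library* trt_lib = TF_LoadLibrary("_trt_engine_op.so", status);
    if (TF_GetCode(status) != TF_OK) {
      fprintf(stderr, "Failed to load _trt_engine_op.so: %s\n", TF_Message(status));
      return 1;
    }

    // Load the graph that was converted with trt.create_inference_graph
    // in Python and frozen to disk ("trt_graph.pb" is a placeholder path).
    tensorflow::GraphDef graph_def;
    TF_CHECK_OK(tensorflow::ReadBinaryProto(tensorflow::Env::Default(),
                                            "trt_graph.pb", &graph_def));

    // Run inference with the regular TensorFlow C++ session API.
    std::unique_ptr<tensorflow::Session> session(
        tensorflow::NewSession(tensorflow::SessionOptions()));
    TF_CHECK_OK(session->Create(graph_def));

    tensorflow::Tensor input(tensorflow::DT_FLOAT,
                             tensorflow::TensorShape({1, 224, 224, 3}));
    std::vector<tensorflow::Tensor> outputs;
    TF_CHECK_OK(session->Run({{"input:0", input}},   // placeholder input node name
                             {"output:0"},           // placeholder output node name
                             {}, &outputs));

    TF_DeleteLibraryHandle(trt_lib);
    TF_DeleteStatus(status);
    return 0;
  }

Once the library is loaded, the converted graph runs like any other TensorFlow graph; only the kernel registration for TRTEngineOp comes from the extra .so.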