How to convert a TensorFlow model to TensorRT?

Hi,

May I know whether you are using the Nano 4GB or the Nano 2GB?
Since the Nano has limited resources, it cannot run inference on a complicated model that requires too much memory.

We are sorry that running a TensorFlow model with TensorRT can seem tricky and unfriendly.
That's because TensorRT doesn't support the TensorFlow format directly but requires an intermediate format.

So if you can convert the model into ONNX format,
you should be able to run it with trtexec to get a performance score.

To convert a TensorFlow model into ONNX, you can try the tf2onnx library; a minimal sketch is shown below.
Please remember to export the model with a fixed input size, or you will need to handle some dynamic shape issues.
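
For example, for a Keras model the export can look roughly like this (the model file name, input shape, and opset below are placeholders, not values from your setup):

import tensorflow as tf
import tf2onnx

# Load your trained model (replace the path with your own file).
model = tf.keras.models.load_model("my_model.h5")

# Give the export a fixed input shape so the ONNX model has static shapes.
spec = (tf.TensorSpec((1, 224, 224, 3), tf.float32, name="input"),)

# Writes model.onnx next to the script.
tf2onnx.convert.from_keras(model, input_signature=spec, opset=13,
                           output_path="model.onnx")

If your model is saved as a SavedModel directory instead, the command-line entry point python -m tf2onnx.convert --saved-model [your/saved_model/dir] --output model.onnx should also work.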

After that, you can run the ONNX model with our built-in trtexec binary:

/usr/src/tensorrt/bin/trtexec --onnx=[your/file]
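
If it helps, trtexec also accepts options such as --fp16 to benchmark in reduced precision and --saveEngine=[your/engine/file] to keep the built engine for later reuse.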

Or you can check the topic below for a Python script that runs inference with TensorRT (ONNX input):
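
In case that topic is not handy, here is a rough sketch of what such a script can look like with the TensorRT 7/8 Python bindings from JetPack plus pycuda. It assumes a single-input, single-output model with static shapes, and model.onnx is a placeholder for your own file:

import numpy as np
import pycuda.autoinit  # creates a CUDA context for us
import pycuda.driver as cuda
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine(onnx_file):
    # Parse the ONNX file and build a TensorRT engine from it.
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, TRT_LOGGER)
    with open(onnx_file, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError("Failed to parse the ONNX file")
    config = builder.create_builder_config()
    config.max_workspace_size = 1 << 28  # 256MB, keep it small on Nano
    return builder.build_engine(network, config)

engine = build_engine("model.onnx")
context = engine.create_execution_context()

# Allocate one host buffer and one device buffer per binding.
host_bufs, dev_bufs, bindings = [], [], []
for i in range(engine.num_bindings):
    shape = engine.get_binding_shape(i)
    dtype = trt.nptype(engine.get_binding_dtype(i))
    host_buf = np.zeros(trt.volume(shape), dtype=dtype)
    dev_buf = cuda.mem_alloc(host_buf.nbytes)
    host_bufs.append(host_buf)
    dev_bufs.append(dev_buf)
    bindings.append(int(dev_buf))

# Binding 0 is assumed to be the input and the last binding the output.
host_bufs[0][:] = np.random.random(host_bufs[0].shape).astype(host_bufs[0].dtype)
cuda.memcpy_htod(dev_bufs[0], host_bufs[0])
context.execute_v2(bindings)
cuda.memcpy_dtoh(host_bufs[-1], dev_bufs[-1])
print("Output:", host_bufs[-1])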

Thanks.