Save serialized TF-TRT engine to reuse in Deepstream

Thanks for the reply @AakankshaS, but I meant that I wanted to do something similar to the trtexec command, only using TF-TRT.

In the end I managed to convert my DeepLabv3+/MobileNetV3 model by exporting it with tf2onnx and then using trtexec to convert the ONNX model to a TensorRT engine.
It needed three adaptations, in case somebody is trying to do the same (rough sketches below the list):

  1. Fixed input dimensions for the neural network
  2. An opset of at least 10 in the TF -> ONNX step, to support the MobileNetV3 layers
  3. Changing the uint8 layers to another datatype with graphsurgeon, because uint8/16 is not supported in the ONNX -> TensorRT conversion
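For points 1 and 2, the conversion commands look roughly like the following. This is a sketch, not my exact invocation: the file names, the input/output tensor names, and the 1x513x513x3 shape are placeholders you would replace with your own values.

```
# TF frozen graph -> ONNX, with a fixed input shape and opset >= 10
python -m tf2onnx.convert \
    --graphdef frozen_inference_graph.pb \
    --inputs ImageTensor:0[1,513,513,3] \
    --outputs SemanticPredictions:0 \
    --opset 11 \
    --output deeplabv3_mnv3.onnx

# ONNX -> TensorRT engine
trtexec --onnx=deeplabv3_mnv3.onnx --saveEngine=deeplabv3_mnv3.engine
```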
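For point 3, a minimal sketch of the datatype change using onnx-graphsurgeon could look like this (not my exact script; it assumes the uint8 tensor is a graph input and that float32 is an acceptable replacement, and any downstream Cast nodes may still need adjusting for your graph):

```python
import numpy as np
import onnx
import onnx_graphsurgeon as gs

# Load the ONNX model produced by tf2onnx
graph = gs.import_onnx(onnx.load("deeplabv3_mnv3.onnx"))

# Change every uint8 graph input to float32 so the ONNX -> TensorRT
# parser does not reject the unsupported datatype.
for inp in graph.inputs:
    if inp.dtype == np.uint8:
        inp.dtype = np.float32

graph.cleanup().toposort()
onnx.save(gs.export_onnx(graph), "deeplabv3_mnv3_fp32_input.onnx")
```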