TF-TRT vs TRT with MNIST model

Hardware Platform: DRIVE AGX Pegasus

Hello,
I found that the TensorRT version of the MNIST model (the “hello world” model of machine learning) is almost 3 times faster than the TF-TRT MNIST model. I have been reading some posts, and it looks like standalone TRT performs better than TF-TRT.

  1. What is the reason behind that? Does this statement hold true in all cases?
  2. If that is the case, is it possible to convert every TF-TRT model to TRT by implementing unsupported layers with the plugin API?

Dear @roshanchaudhari,

If that is the case, is it possible to convert every TF-TRT model to TRT by implementing unsupported layers with the plugin API?

The TF-TRT framework converts a TensorFlow model to a TRT-optimized model directly. I am not sure what you mean by a TF-TRT -> TRT model. If you have a TF model, you can generate a TRT model by implementing plugin layers for unsupported ops. When a TF model is converted to a TRT model, it goes through several optimizations, such as layer fusion and layer removal, which improve performance.

Also, could you confirm whether it is the TF model or a TF-TRT-converted model for MNIST?

When I say TF-TRT, my code looks like: “from tensorflow.python.compiler.tensorrt import trt_convert as trt”,
and when I say TensorRT only, it is: “import tensorrt as trt”.
Let me know if that is not clear.

Dear @roshanchaudhari,
Ok. TRT is usually faster than TF-TRT because TF-TRT breaks the TF graph into several TRT sub-graphs and TF sub-graphs when there are unsupported ops, which hurts performance. However, we can support those ops with plugins and run the whole graph as a single TRT engine.

Note that the TRT Python bindings are not available on the DRIVE AGX platform. Once you have a TRT model, you can use the TRT C++ APIs on DRIVE AGX for inference.

Got it.
Also, it looks like the UFF and Caffe parsers are deprecated in the latest versions.
Does the ONNX backend in TensorRT, including the parser, support all layers? I see that there are still many open issues with ONNX.

Dear @roshanchaudhari,
Yes, you are right. We keep adding new ops to the ONNX parser. You can implement unsupported layers as plugins. Let us know if you have any issues with the ONNX parser.

Great, thanks! We can close this topic.