TensorRT inference slower than TensorFlow

Description

I created an engine (.pb > .onnx > .plan); however, inference with the engine is slower than inference with the original TensorFlow model (.pb).
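For reference, a typical .pb > .onnx > .plan pipeline looks roughly like the commands below. This is a hedged sketch only: the file names and the input/output tensor names (`input:0`, `embeddings:0`) are placeholders, not values from this post, and flag availability varies between TensorRT releases.

```shell
# Convert the frozen TensorFlow graph to ONNX (tensor names are placeholders)
python -m tf2onnx.convert \
    --input frozen_model.pb \
    --inputs input:0 \
    --outputs embeddings:0 \
    --output model.onnx

# Build a serialized TensorRT engine from the ONNX model
# (trtexec ships with TensorRT)
trtexec --onnx=model.onnx --saveEngine=model.plan
```

Checking that `--outputs` points at the model's final embedding tensor, not an intermediate feature map, is worth doing before building the engine.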

Environment

XAVIER
TensorRT Version: 5.1.6
CUDA Version: 10.0
CUDNN Version: 7.5.0
Operating System + Version:
Python Version: 3.6
TensorFlow Version: 1.14.0

Hi @awatef.edhib
We recommend trying the latest TRT release for improved performance.
https://developer.nvidia.com/nvidia-tensorrt-7x-download

Thanks!

thanks for your reply
The problem is that the feature vector (the image descriptor) produced by the engine has size 73984, which makes computing the distance between two vectors slow. The vector produced by the TF model, on the other hand, has size 512.
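A hedged sketch of what this size mismatch could mean (the 17 × 17 × 256 factorization and all variable names are assumptions, not from the post): 73984 happens to equal 17 × 17 × 256, which would be consistent with the engine returning a flattened convolutional feature map rather than the final 512-d embedding, e.g. if the dense/pooling head of the graph was lost during conversion.

```python
import numpy as np

# 73984 factors as 17 * 17 * 256 -- plausibly a flattened H x W x C
# feature map, not a final embedding.
assert 17 * 17 * 256 == 73984

# Simulated outputs (random stand-ins, not real model data)
feat_trt = np.random.rand(73984).astype(np.float32)  # engine output
feat_tf = np.random.rand(512).astype(np.float32)     # TF model output

def l2_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Euclidean distance between two same-sized descriptors."""
    return float(np.linalg.norm(a - b))

# The distance itself is cheap either way; the real cost of a 144x larger
# descriptor shows up when comparing against a large gallery of vectors.
print(feat_trt.shape, feat_tf.shape)
```

If the engine's output is indeed an intermediate feature map, the fix is in the conversion step (exporting the correct output tensor), not in the distance computation.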

Hi @awatef.edhib,
TRT 5 is an old release and may have performance issues, so we suggest trying the latest TRT release.
If the issue persists, please share your model and script so that we can help you further.

Thanks!