RuntimeError: Tensorflow has not been built with TensorRT support

Description

I am trying to convert the saved_model format into TensorRT using Google Colab. For that, I'm referring to the post (Accelerating Inference in TensorFlow with TensorRT User Guide :: NVIDIA Deep Learning Frameworks Documentation).
And it is giving me the below error:
RuntimeError: Tensorflow has not been built with TensorRT support.
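For context, the conversion step that raises this error follows the TF-TRT user guide roughly as in the sketch below. The SavedModel paths are placeholders, and the imports are guarded so the function simply reports failure where TensorFlow or TensorRT support is missing, rather than crashing:

```python
def convert_saved_model(input_dir, output_dir):
    """Attempt a TF-TRT conversion; return "ok" or an error string."""
    try:
        # TF-TRT converter shipped inside TensorFlow itself
        from tensorflow.python.compiler.tensorrt import trt_convert as trt
    except ImportError as err:
        return f"TensorFlow/TF-TRT not importable: {err}"
    try:
        converter = trt.TrtGraphConverterV2(input_saved_model_dir=input_dir)
        converter.convert()  # raises RuntimeError if TF lacks TensorRT support
        converter.save(output_dir)
        return "ok"
    except Exception as err:  # e.g. the RuntimeError quoted above
        return f"conversion failed: {err}"
```

For example, `convert_saved_model("saved_model", "trt_saved_model")` returns the quoted `RuntimeError` message when the installed TensorFlow was built without TensorRT.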

Is converting the model on Google Colab a proper approach, or do I need to use Anaconda to install TensorRT and then convert it?

Thank you!

Hi,

Could you please make sure you have installed tensorflow-gpu?
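One way to check what the installed TensorFlow build supports is a quick sketch like the one below. The `is_cuda_build`/`is_tensorrt_build` keys are present in recent TF 2.x builds but not guaranteed on older versions, so `.get()` is used defensively:

```python
def tf_build_summary():
    """Summarize the installed TensorFlow build, or return None if absent."""
    try:
        import tensorflow as tf
    except ImportError:
        return None
    info = dict(tf.sysconfig.get_build_info())
    return {
        "version": tf.__version__,
        # key availability varies by TF version, hence .get()
        "cuda_build": info.get("is_cuda_build"),
        "tensorrt_build": info.get("is_tensorrt_build"),
        "gpus": [d.name for d in tf.config.list_physical_devices("GPU")],
    }
```

If `tensorrt_build` comes back falsy, the error in this thread is expected.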
If interested, you can also use NVIDIA NGC containers for TensorFlow, which are built and tested with TF-TRT support enabled.
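Running an NGC TensorFlow container looks roughly like this (the tag below is an example; pick a current `yy.mm-tf2-py3` tag from the NGC catalog):

```shell
# Pull and run an NGC TensorFlow container; TF-TRT is preinstalled inside.
docker pull nvcr.io/nvidia/tensorflow:23.03-tf2-py3
docker run --gpus all -it --rm nvcr.io/nvidia/tensorflow:23.03-tf2-py3
```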

Thank you.

tensorflow-gpu has been removed, right?

Any updates on using TensorFlow 2.x? @neo21995 is right; the tensorflow-gpu package has been removed as of December 2022.
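For TensorFlow 2.x, GPU support ships in the main `tensorflow` wheel rather than a separate tensorflow-gpu package; on recent releases there is also a pip extra that pulls in the CUDA libraries (exact behavior depends on the TF version and platform):

```shell
# Standard install; on Linux the main wheel includes GPU support.
pip install tensorflow
# On TF >= 2.14, this extra also installs the CUDA libraries via pip.
pip install "tensorflow[and-cuda]"
```

Note that neither install gives you TF-TRT unless the wheel itself was built with TensorRT support, which is why the NGC containers are the recommended route.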