Description
I am trying to convert the saved_model format into TensorRT using Google Colab; for that, I’m referring to the post (Accelerating Inference in TensorFlow with TensorRT User Guide :: NVIDIA Deep Learning Frameworks Documentation).
And it is giving me the below error:
RuntimeError: Tensorflow has not been built with TensorRT support.
Is converting the model on Google Colab the proper way, or do I need to use Anaconda to install TensorRT and then convert it?
Thank you!
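Before attempting a conversion, it can help to check whether the installed TensorFlow binary reports being built with TensorRT at all. A minimal probe sketch, assuming the build-info dict exposes an `is_tensorrt_build` flag (recent TF 2.x GPU builds do; the `.get(..., False)` fallback hedges against builds that omit the key):

```python
import importlib.util

def tf_trt_available():
    """Return True only if TensorFlow is installed and reports a TensorRT-enabled build."""
    if importlib.util.find_spec("tensorflow") is None:
        return False  # TensorFlow itself is not installed
    import tensorflow as tf
    build_info = tf.sysconfig.get_build_info()
    # GPU builds expose flags such as "is_cuda_build" and "is_tensorrt_build";
    # fall back to False when the key is absent.
    return bool(build_info.get("is_tensorrt_build", False))

print(tf_trt_available())
```

If this prints False, a `TrtGraphConverterV2` call on that runtime is expected to fail with the same RuntimeError as above.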
Hi,
Could you please make sure you installed tensorflow-gpu?
If interested you can also use NVIDIA NGC containers for TensorFlow, which are built and tested with TF-TRT support enabled.
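For reference, running one of those NGC containers on a machine with Docker and the NVIDIA Container Toolkit looks roughly like this (the release tag below is just an example; pick an actual tag from the NGC catalog):

```shell
# Pull a TF-TRT enabled TensorFlow image from NGC
# ("23.12-tf2-py3" is an example tag; check the catalog for current releases)
docker pull nvcr.io/nvidia/tensorflow:23.12-tf2-py3

# Start it with GPU access and an interactive shell
docker run --gpus all -it --rm nvcr.io/nvidia/tensorflow:23.12-tf2-py3
```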
Thank you.
tensorflow-gpu has been removed, right?
1 Like
Any updates on using TensorFlow 2.x? @neo21995 is right, tensorflow-gpu has been removed since December 2022.
Was this issue resolved? I’m getting the same error when I’m trying to run TensorRT on Google Colab…
This is my code…
//========================================================
!pip install pillow matplotlib
!pip install tensorflow

import tensorflow as tf
from tensorflow.python.compiler.tensorrt import trt_convert as trt

print("TensorFlow version: ", tf.version.VERSION)

# Load a pre-trained model (e.g., ResNet50)
model = tf.keras.applications.ResNet50(weights="imagenet")
# Export in SavedModel format, which is what the TF-TRT converter expects
model.export("resnet50_saved_model")

# TensorRT conversion parameters
conversion_params = trt.DEFAULT_TRT_CONVERSION_PARAMS._replace(precision_mode="FP16")
converter = trt.TrtGraphConverterV2(
    input_saved_model_dir="resnet50_saved_model",
    conversion_params=conversion_params,
)
converter.convert()
converter.save("resnet50_trt_model")
print("TensorRT optimized model saved!")
//==========================================================
So the TensorFlow version is 2.17.1.
And this is the error I’m getting:
RuntimeError: Tensorflow has not been built with TensorRT support.
Thanks in advance!!!
Br,
Raghu
Hi @raghums534, You’ll need to use TensorFlow binaries that were compiled with TensorRT. The NVIDIA NGC containers for TensorFlow have been compiled in this way.
Best,
Sophie
Hi Sophie,
But the thing is, I’m trying to run the TensorRT program on Google Colab because I do not have a GPU on my host machine. Google Colab does not support Docker containers, so the compilation described in the link above is not possible. How do I go about it when I’m trying to run TensorRT on Google Colab and not on the host machine?
Br,
Raghu
Aaah I see - sorry I missed the colab requirement in your initial message. Let me reach out to the team and see if we have a solution!
Hi Sophie,
I was able to fix the issue. Could you please give me the contact details of your team (possibly an email ID)? I would like to document the fix and the code, if it can be part of an open-source resource, so that it will be useful for others who try to run TensorFlow TensorRT on Google Colab.
Thanks and Regards,
Raghu
Hi,
What is the update on this topic? It’s been quite a while…
Br,
Raghu
Hi All,
GitHub - Raghu-dev-pixel/Tensorflow_TRT: Corrected version of an image classification model using TensorFlow with TensorRT on Google Colab.
I contacted the owner of the TensorFlow/TensorRT repository; NVIDIA is no longer maintaining it, and the latest version of TensorFlow (2.18.0) does not support TensorRT. So I believe the final version of TensorFlow with TensorRT support is 2.17.0. I have implemented working code for anyone who would like to use TensorFlow TensorRT on Google Colab; it is available in the repository linked above.
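For anyone reproducing this on Colab, pinning the last TRT-capable release before running the conversion might look like the following (a sketch assuming the stock Colab runtime; restart the runtime after installing so the pinned version is picked up, and see the linked repository for the full setup, which may also require the TensorRT runtime libraries):

```shell
# Pin the last TensorFlow release that still shipped the TF-TRT bridge
pip install tensorflow==2.17.0
```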
Thanks and Regards,
Raghu