TF-TRT Errors fetching dynamic library


I am experiencing failures loading dynamic libraries when trying to carry out a TF-TRT conversion:

2021-10-21 16:13:57.478648: W tensorflow/stream_executor/platform/default/] Could not load dynamic library ''; dlerror: cannot open shared object file: No such file or directory

2021-10-21 16:13:57.478672: F tensorflow/compiler/tf2tensorrt/stub/] getInferLibVersion symbol not found.

Aborted (core dumped)

I have been having significant trouble updating a workflow that generates models in TensorFlow and converts them to TensorRT. It previously converted frozen graphs to the UFF format. I am trying to update to TensorFlow 2.x, where UFF conversion is no longer supported and TF-TRT is the recommended workflow. Any further advice is appreciated. (I have managed to convert the model to ONNX.)
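Since the dlerror says the loader cannot open the shared object at all, it may help to first confirm which (if any) TensorRT runtime library the dynamic loader can actually resolve, independently of TensorFlow. A minimal diagnostic sketch, assuming a Linux system and the standard TensorRT 8 library base names (adjust the names for your install):

```python
import ctypes
import ctypes.util

def check_lib(name):
    """Return the resolved library path if it can be dlopen'ed, else None."""
    path = ctypes.util.find_library(name)
    if path is None:
        return None
    try:
        ctypes.CDLL(path)  # attempt the same dlopen that TF-TRT performs
        return path
    except OSError:
        return None

# Library base names TF-TRT tries to load (TensorRT 8 naming assumed).
for lib in ("nvinfer", "nvinfer_plugin"):
    print(lib, "->", check_lib(lib) or "NOT FOUND")
```

If these report NOT FOUND even though TensorRT is installed, the usual fix is adding the TensorRT lib directory to LD_LIBRARY_PATH or re-running ldconfig after installation.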


TensorRT Version: 8.2.0-1 (also reproduced on a separate environment)
GPU Type: 2080 ti
Nvidia Driver Version: 470.57.02
CUDA Version: 11.4
CUDNN Version: unsure (difficult to check)
Operating System + Version: Ubuntu 20.04
Python Version (if applicable): 3.8
TensorFlow Version (if applicable): 2.6.0
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):

Relevant Files

The issue has been reproduced with this test-case model: TensorFlow 2 quickstart for beginners | TensorFlow Core

Steps To Reproduce

import tensorflow as tf
from tensorflow.python.compiler.tensorrt import trt_convert as trt

input_saved_model_dir = "/opt/transfer/models/noopttest/"
conversion_params = trt.DEFAULT_TRT_CONVERSION_PARAMS
conversion_params = conversion_params._replace(precision_mode="FP16")

converter = trt.TrtGraphConverterV2(
    input_saved_model_dir=input_saved_model_dir,
    conversion_params=conversion_params)
converter.convert()

Please check the below links, as they might answer your concerns.

Thank you for these links; the trtexec command-line tool looks like a useful way to test out my ONNX models.
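For reference, a typical trtexec invocation for sanity-checking an ONNX model looks like the following. This is only a sketch: the model paths are hypothetical, and trtexec typically ships under /usr/src/tensorrt/bin in the standard TensorRT packages.

```shell
# Build a TensorRT engine from an ONNX model and time inference.
# --fp16 enables FP16 kernels, mirroring precision_mode="FP16" above.
trtexec --onnx=/opt/transfer/models/model.onnx \
        --saveEngine=/opt/transfer/models/model.trt \
        --fp16
```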

However, I am not sure this addresses the TF-TRT conversion error at all. Is the recommendation that I pursue only the ONNX path, building and running on TensorRT from that format?

It is also unclear whether a model generated with a particular TF and ONNX version should be able to be built and run for inference on other versions.


We recommend you check the below sample links in case of TF-TRT integration issues.

If the issue persists, we recommend you reach out to the TensorFlow forum.


Those links were not relevant to the issue I was having. I followed up with a request on the TensorFlow forum here:
TF-TRT: No Support for TensorRT v8? - General Discussion - TensorFlow Forum

A potential solution is contained in that thread: the failure stems from TensorRT v8 breaking the API that TF-TRT relies on. There is an in-progress draft pull request for a fix on 8.2 linked in that thread, for anyone experiencing similar issues.


Sorry for that. Could you please share the complete error logs with us?

Thank you.