Could not load dynamic library 'libnvinfer.so.7'; dlerror: libnvinfer.so.7 while converting TF to TRT

Description

I am trying to convert a TensorFlow model to a TensorRT FP16 graph, but the conversion fails with: Could not load dynamic library 'libnvinfer.so.7'; dlerror: libnvinfer.so.7

Environment

TensorRT Version: 8.0.1-1
GPU Type: dGPU
Nvidia Driver Version: 510.47.03
CUDA Version: 11.3
Operating System + Version: Ubuntu-18.04
Python Version (if applicable): 3.6.9
TensorFlow Version (if applicable): 2.6.2
Baremetal or Container (if container which image + tag): nvcr.io/nvidia/deepstream:6.1-devel

Codebase:

from tensorflow.python.compiler.tensorrt import trt_convert as trt
import tensorflow as tf
input_saved_model_dir = './inference_graph/saved_model/'
output_saved_model_dir = './inference_graph/trt_6_0/'

# converter = trt.TrtGraphConverterV2(input_saved_model_dir=input_saved_model_dir)
# converter.convert()
# converter.save(output_saved_model_dir)

# params = tf.experimental.tensorrt.ConversionParams(
#     precision_mode='FP16')
# converter = tf.experimental.tensorrt.Converter(
#     input_saved_model_dir=input_saved_model_dir, conversion_params=params)
# converter.convert()
# converter.save(output_saved_model_dir)

# converter = tf.experimental.tensorrt.Converter(input_saved_model_dir=input_saved_model_dir)
# converter.convert()
# converter.save(output_saved_model_dir)


# with tf.io.gfile.GFile('./inference_graph/frozen_inference_graph.pb', 'rb') as f:
#     graph_def = tf.compat.v1.GraphDef()
#     graph_def.ParseFromString(f.read())
# converter = trt.TrtGraphConverter(input_graph_def=graph_def)
# frozen_graph = converter.convert()
# converter.save(output_saved_model_dir)

# Convert the SavedModel to a TF-TRT graph with FP16 precision (TF2 API).
conversion_params = trt.DEFAULT_TRT_CONVERSION_PARAMS
conversion_params = conversion_params._replace(precision_mode="FP16")
converter = trt.TrtGraphConverterV2(
    input_saved_model_dir=input_saved_model_dir,
    conversion_params=conversion_params)
converter.convert()
converter.save(output_saved_model_dir)
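Before rerunning the conversion, it can help to confirm which `libnvinfer` sonames the dynamic loader can actually resolve. The TF 2.6 wheel dlopens `libnvinfer.so.7`, while a container shipping TensorRT 8.x provides `libnvinfer.so.8`, which matches the dlerror above. A minimal diagnostic sketch (my own, not from the original post) using only `ctypes`:

```python
import ctypes

def can_load(soname: str) -> bool:
    """Return True if the dynamic loader can find and open `soname`."""
    try:
        ctypes.CDLL(soname)
        return True
    except OSError:
        return False

# TF 2.6 is built against TensorRT 7, so it dlopens libnvinfer.so.7;
# if only libnvinfer.so.8 is found, the TF-TRT conversion above will
# fail with exactly the dlerror reported in this thread.
for soname in ("libnvinfer.so.7", "libnvinfer.so.8"):
    print(soname, "found" if can_load(soname) else "NOT found")
```

If `libnvinfer.so.7` prints "NOT found", installing a TensorRT 7.x runtime (or pointing `LD_LIBRARY_PATH` at one) is needed before TF-TRT conversion can work.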

Error:

2022-08-16 11:43:41.691740: W tensorflow/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer.so.7'; dlerror: libnvinfer.so.7: cannot open shared object file: No such file or directory; LD_LIBRARY_PATH: /usr/local/cuda-11.3/lib64:/usr/local/cuda/lib64:/usr/local/cuda-11.4/lib64:/usr/local/cuda/lib64:/usr/local/cuda-11.4/lib64:/usr/local/cuda/lib64:/usr/local/cuda-11.1/lib64:/usr/lib/x86_64-linux-gnu:/usr/lib/i386-linux-gnu:/usr/local/nvidia/lib:/usr/local/nvidia/lib64:/usr/local/nvidia/lib:/usr/local/nvidia/lib64
2022-08-16 11:43:41.691772: F tensorflow/compiler/tf2tensorrt/stub/nvinfer_stub.cc:49] getInferLibVersion symbol not found.

Thanks.

Hi,

This looks like a DeepStream-related issue. We will move this post to the DeepStream forum.

Thanks!

From the description, it seems you used TensorRT version 8.0.1-1. If you want to deploy the engine in DeepStream, make sure the TensorRT version used to build the engine matches the TensorRT version used at deployment.

Thanks @Amycao for the reply.

Installing the TensorRT package nv-tensorrt-repo-ubuntu1804-cuda11.1-trt7.2.2.3-ga-20201211_1-1_amd64 resolved the issue.
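After installing the 7.2.2 package, a quick sanity check (a sketch of my own, not part of the original post) is to resolve the same `getInferLibVersion` symbol that the TensorFlow stub (`nvinfer_stub.cc` in the error log) fails to find:

```python
import ctypes

def decode_trt_version(ver: int) -> tuple:
    # getInferLibVersion() encodes the version as major*1000 + minor*100 + patch,
    # e.g. 7202 for TensorRT 7.2.2.
    return (ver // 1000, (ver % 1000) // 100, ver % 100)

def trt_version(soname: str = "libnvinfer.so.7"):
    """Return (major, minor, patch) reported by libnvinfer, or None when the
    dynamic loader cannot find the library (the original failure mode)."""
    try:
        lib = ctypes.CDLL(soname)
    except OSError:
        return None
    return decode_trt_version(lib.getInferLibVersion())

print(trt_version())
```

If this prints a tuple instead of None, TensorFlow's TF-TRT stub should be able to load the library as well.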

https://docs.nvidia.com/drive/drive-os-5.2.0.0L/drive-qsg-nv/trt-installation-instructions/index.html

Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.