How to deploy the Caffe parser

I followed the instructions on GitHub to compile and install, and the sample googlenet passed. However, it doesn’t seem to use the compiled code; I added a print statement to the console to test this.

The binary lib has:

The compiled lib has:

One has the “_” and the other doesn’t.

Please shed some light.


This seems to be due to an environment variable setting issue.
Could you please check whether “LD_LIBRARY_PATH” and other environment variables have been updated to point to the new release/code?
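As a concrete check, you can print the current search path and prepend the new release’s lib directory. The install prefix below is an assumption; substitute wherever you extracted the package:

```shell
# Print the current dynamic-linker search path
echo "LD_LIBRARY_PATH=$LD_LIBRARY_PATH"

# Hypothetical tar-package install prefix; replace with your actual directory
TRT_DIR="$HOME/TensorRT-7.0.0.11"

# Put the new release's libraries first so they win over any older copies
export LD_LIBRARY_PATH="$TRT_DIR/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "LD_LIBRARY_PATH=$LD_LIBRARY_PATH"
```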


Also, check the installed TensorRT packages using the command below:

dpkg -l | grep TensorRT



LD_LIBRARY_PATH is set correctly.
I followed the instructions here: which use the tar package instead of the deb.
My question is why the compiled lib name is different from the binary version. Is the additional “_” intentional, or is that actually the issue I’m having?

Your installation guide says:

The library functionality from previous versions is included in since TensorRT 5.0. The installed symbolic link for is updated to point to the new library. The static library libnvcaffe_parser.a is also symbolically linked to libnvparsers_static.a.
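To see what that symbolic linking means in practice, here is a small reproduction in a throwaway directory; only the two file names come from the guide, everything else is scratch setup:

```shell
# Recreate the documented link layout in a scratch directory
mkdir -p /tmp/trt_link_demo && cd /tmp/trt_link_demo
touch libnvparsers_static.a                       # the new library
ln -sf libnvparsers_static.a libnvcaffe_parser.a  # legacy name kept as a symlink
ls -l libnvcaffe_parser.a
```

Following the old name through the symlink lands on the new library, which is why both names keep working after the upgrade.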

However, when I built from source, the /build/out directory only contains these Caffe parser files:

I can’t figure out how to overwrite the Caffe parser in the installed lib.
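One way to make a freshly built parser take precedence is to copy it next to the release libraries and put that directory first on the search path. This is only a sketch: every path, and the built library’s file name, are placeholders I made up for illustration (the demo creates throwaway files rather than touching a real install):

```shell
# Hypothetical locations for the OSS build output and the release lib dir;
# substitute your real paths (this demo creates throwaway copies)
BUILD_OUT=/tmp/trt_override_demo/build/out
TRT_LIB=/tmp/trt_override_demo/TensorRT/lib
mkdir -p "$BUILD_OUT" "$TRT_LIB"
touch "$BUILD_OUT/libnvcaffeparser.so"   # stand-in for the rebuilt parser

# Copy the rebuilt parser alongside the release libraries...
cp "$BUILD_OUT/libnvcaffeparser.so" "$TRT_LIB/"

# ...and make sure that directory is searched first at run time
export LD_LIBRARY_PATH="$TRT_LIB:$LD_LIBRARY_PATH"
ls -l "$TRT_LIB"
```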


Can you provide the following information so we can help you better?
Provide details on the platforms you are using:
o Linux distro and version
o GPU type
o Nvidia driver version
o CUDA version
o CUDNN version
o Python version [if using python]
o Tensorflow and PyTorch version
o TensorRT version


Linux distro and version: Ubuntu 18.04.3 LTS

GPU type: GeForce RTX 2080 Ti

Nvidia driver version: 418.87.00

CUDA version: release 10.1, V10.1.243

CUDNN version: 7.5.0

Python version: 3.6.9

Tensorflow version: 1.15.0

TensorRT version:

So I’m trying to apply the patch to CaffeParser:

I use the TensorRT Python API to build the CUDA engine, but my changes don’t take effect.
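One way to diagnose this is to list which parser shared objects the Python process actually maps; if a stock copy is shadowing the rebuilt one, it shows up here. This is a best-effort check that assumes the package is importable as `tensorrt`:

```shell
python3 - <<'PY'
try:
    import tensorrt  # noqa: F401  (may not be installed; check is best-effort)
except ImportError:
    print("tensorrt module not found")
else:
    # /proc/self/maps lists every shared object loaded into this process
    with open("/proc/self/maps") as maps:
        paths = {line.split()[-1] for line in maps if "parser" in line.lower()}
    for path in sorted(paths):
        print(path)
PY
```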


Can you retry a fresh installation using the latest TRT version?


I tried the latest version and the changes still don’t take effect when I use the Python API. The C++ API seems to work. Is this expected? Do you have any workaround?
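One possible workaround (an assumption on my part, not a verified fix) is to preload the rebuilt library so the interpreter binds to it before any system copy. The library path and script name below are placeholders:

```shell
REBUILT="$HOME/TensorRT/build/out/libnvcaffeparser.so"  # hypothetical build output
if [ -f "$REBUILT" ]; then
    # Placeholder script name: whatever builds your engine via the Python API
    LD_PRELOAD="$REBUILT" python3 build_engine.py
else
    echo "rebuilt parser not found at $REBUILT"
fi
```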


Could you please try your application directly on TRT 7 without applying the patch?

If the issue persists, could you please share the script and model file to reproduce the issue so we can help better?