How to recompile libnvparsers

Hello everyone,

I need to apply this fix to TensorRT version 6.0.1.5:
https://github.com/NVIDIA/TensorRT/issues/179

But when I compile the open-source branch “6.0.1.5” from GitHub, the resulting libraries only include libnvcaffeparser, not libnvparsers.
Is there any way I can recompile and apply the changes to libnvparsers?

Thanks in advance!

Hi,

Since the UFF parser is not part of the TRT OSS release, libnvparsers is not generated.
You should be able to pick up the Caffe parser changes by linking against libnvcaffeparser directly.

Please let us know if you face any issues.

Thanks

Thanks a lot for your reply. Yes, I’m able to use libnvcaffeparser with TRT’s C++ API, but not with the Python API, since it appears to load libnvparsers. Is there any way to make TRT’s Python API call libnvcaffeparser instead of libnvparsers?

Hi,

Your system likely has two versions of the libraries: the open-source build and the binary release.
It is possible that Python is still loading the binary-released one.

Thanks

Hi,

Thank you for your reply. As far as I understand, we currently can’t build the Python wheel from the open-source TRT; we can only use the one from the binary release, right? That Python wheel seems to load libnvparsers from the binary release only, and I don’t know how to make it load libnvcaffeparser from the open-source TRT build instead.
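One workaround that sometimes helps on Linux is to preload the rebuilt OSS library before importing the Python bindings, so the dynamic linker resolves the parser symbols from your build first. This is only a sketch, not a guaranteed fix: whether it works depends on how the bindings load their dependencies, and the `.so` path in the comment is hypothetical — point it at your own OSS build output:

```python
import ctypes

def preload(lib_path):
    """Load a shared library with RTLD_GLOBAL so its symbols take
    precedence when later libraries are resolved."""
    return ctypes.CDLL(lib_path, mode=ctypes.RTLD_GLOBAL)

# Hypothetical path to the OSS build output; adjust to your tree.
# preload("/path/to/TensorRT/build/out/libnvcaffeparser.so")
# import tensorrt as trt  # import only after preloading
```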

Hi,

Sorry for late reply.
Since the Python bindings are not open-sourced, the library used by the Python API does not pick up your changes.
Currently, in TRT 7, a library rebuilt with custom OSS changes can only be used through the C++ API.

Thanks

Hi,

It’s OK. Thanks for your reply. I converted my model to ONNX and found that TensorRT’s ONNX parser works very well with both the C++ and Python bindings. ONNX seems to be much better supported than the old Caffe path now. By the way, are there any plans to open-source more of NVIDIA’s libraries, such as TensorRT or DeepStream?
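For reference, the ONNX route mentioned above looks roughly like this with the TensorRT Python API. This is a sketch against the TRT 6/7-era API (explicit-batch network plus `trt.OnnxParser`); `build_cuda_engine` was deprecated in later versions in favor of `build_engine` with a builder config, so details may differ on your version:

```python
import os

def build_engine_from_onnx(onnx_path):
    """Parse an ONNX model and build a TensorRT engine (sketch)."""
    if not os.path.exists(onnx_path):
        raise FileNotFoundError(onnx_path)
    import tensorrt as trt  # imported lazily; requires the TRT wheel
    logger = trt.Logger(trt.Logger.WARNING)
    builder = trt.Builder(logger)
    # ONNX models require an explicit-batch network definition.
    flags = 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    network = builder.create_network(flags)
    parser = trt.OnnxParser(network, logger)
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            return None
    return builder.build_cuda_engine(network)
```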

Hi,
I don’t have any information about which libraries are planned to be open-sourced.
Please stay tuned for updates on the NVIDIA website and forums.

Thanks