I’m wondering if it is possible to use the TensorRT C++ library (libnvinfer) by loading it at runtime with dlopen instead of linking at compile time. I need this because TensorRT would be an optional component loaded by the software on request.
As far as I know, the documentation doesn’t mention a way to do it. If possible, it would be nice to have some reference, examples or advice on how to do that!
In particular, do I necessarily need to create a wrapper library to access the class methods? Is there any straightforward way to use the TensorRT classes in my software without linking against it?
I think you are referring to the libnvinfer.so.xx file? Like all dynamically linked shared object (.so) libraries, libnvinfer is resolved at run time, but the library still has to be available during the compile/link phase. The shared object is not copied into the executable; it is loaded when the program starts, so it becomes a hard dependency of the binary.
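For illustration, a minimal sketch of the usual link-time approach, assuming a recent TensorRT release (the names come from NvInferRuntime.h; exact signatures vary between versions):

```cpp
// Usual approach: link against libnvinfer at build time.
#include <NvInferRuntime.h>
#include <cstdio>

class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING)
            std::fprintf(stderr, "[TRT] %s\n", msg);
    }
};

int main() {
    Logger logger;
    nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(logger);
    // ... deserialize an engine, run inference, ...
    delete runtime;  // older releases use runtime->destroy() instead
    return 0;
}
```

Building this needs something like `g++ main.cpp -lnvinfer` (plus the TensorRT include/library paths), which records libnvinfer.so.x as a DT_NEEDED dependency of the binary, so the dynamic loader refuses to start the program at all if the library is missing.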
Sure, I’m referring to libnvinfer.so.xx. Do you mean it wouldn’t be possible to load the .so file optionally at runtime, without making the software depend on libnvinfer?
This already happens with the Python wrapper. Is there an easy way to do it with C++ too?
This is not a TRT-specific issue; it is a general C++ question about loading shared libraries and classes dynamically at runtime. C++ has no built-in support for symbols that are resolved dynamically at runtime instead of being linked in the normal way. You have to implement it manually, e.g. with dlopen/dlsym plus wrapper functions or macros, or use something like the experimental DynamicLibrary facility to simplify that a bit.
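That said, TensorRT is fairly dlopen-friendly, because the nvinfer1 classes are pure-virtual interfaces that are created through extern "C" factory functions. Below is a sketch of the idea, assuming the createInferRuntime_INTERNAL entry point that the inline createInferRuntime() helper in NvInferRuntime.h forwards to; check the header of your TensorRT version for the exact name and signature before relying on it:

```cpp
#include <dlfcn.h>
#include <cstdint>
#include <cstdio>
#include <NvInferRuntime.h>  // only the interface definitions are used; nothing is linked

class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING)
            std::fprintf(stderr, "[TRT] %s\n", msg);
    }
};

int main() {
    // Adjust the soname to the installed major version.
    void* handle = dlopen("libnvinfer.so.8", RTLD_NOW | RTLD_LOCAL);
    if (!handle) {
        std::fprintf(stderr, "TensorRT not available: %s\n", dlerror());
        return 0;  // the rest of the application keeps working without TRT
    }

    // Resolve the C factory entry point (signature mirrors the inline wrapper in the header).
    using CreateRuntimeFn = void* (*)(void* logger, int32_t version);
    auto createRuntime =
        reinterpret_cast<CreateRuntimeFn>(dlsym(handle, "createInferRuntime_INTERNAL"));
    if (!createRuntime) {
        std::fprintf(stderr, "symbol not found: %s\n", dlerror());
        dlclose(handle);
        return 1;
    }

    Logger logger;
    auto* runtime = static_cast<nvinfer1::IRuntime*>(createRuntime(&logger, NV_TENSORRT_VERSION));
    // From here on, every call goes through the interface's vtable, so no other
    // libnvinfer symbol has to be resolved at link time.
    // ... runtime->deserializeCudaEngine(...), create an execution context, ...
    delete runtime;  // older releases use runtime->destroy() instead
    dlclose(handle);
    return 0;
}
```

So you don't strictly need a separate wrapper library: one dlsym per factory function is enough, and all member calls afterwards are ordinary virtual dispatch. A thin wrapper that hides the dlopen/dlsym bookkeeping is still a reasonable way to keep that detail out of the rest of the code.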