Loading of TensorRT custom plugin shared library

I built a custom TRT plugin as a library and I now want to use it.
However I don’t see an API with which to tell TensorRT where to look for my custom plugin.
It works if I use dlopen() to load the library by hand, but I expected there would be a more civilized way to do it. Still, I have not been able to find such a function in the API documentation.

Am I missing something, or is a manual dlopen() indeed the correct way to do it?


Please refer to the links below for the custom plugin implementation and sample:

The IPluginV2 and IPluginV2Ext interfaces are still supported for backward compatibility with TensorRT 5.1 and 6.0.x respectively; however, we recommend that you write new plugins or refactor existing ones to target the IPluginV2DynamicExt or IPluginV2IOExt interfaces instead.



I have read those articles, but my problem isn't addressed there. I don't have issues implementing the plugin; the problem is loading it the right way.

What’s the canonical way to tell TensorRT where custom plugins reside?
Right now the only way I can load my plugin into the registry is if I manually call dlopen() on my library.
Is there a better way of achieving this?

I tried LD_PRELOAD, but it doesn't work: in this case my library is loaded before nvinfer.so and fails to resolve the getPluginRegistry() symbol.


You can load the shared object file containing the plugin implementation. Please refer to the following sample on loading a custom plugin; hope this will help you.

Thank you.