Problem with REGISTER_TENSORRT_PLUGIN

Please provide the following info (check/uncheck the boxes after creating this topic):
Software Version
DRIVE OS Linux 5.2.6
DRIVE OS Linux 5.2.6 and DriveWorks 4.0
DRIVE OS Linux 5.2.0
DRIVE OS Linux 5.2.0 and DriveWorks 3.5
NVIDIA DRIVE™ Software 10.0 (Linux)
NVIDIA DRIVE™ Software 9.0 (Linux)
other DRIVE OS version
other

Target Operating System
Linux
QNX
other

Hardware Platform
NVIDIA DRIVE™ AGX Xavier DevKit (E3550)
NVIDIA DRIVE™ AGX Pegasus DevKit (E3550)
other

SDK Manager Version
1.9.10816
other

Host Machine Version
native Ubuntu 18.04
other

Hi,
I have a problem building my own plugin (ResizeNearest) for TensorRT (TensorRT 5.1.4): "undefined reference to symbol ‘getPluginRegistry’".
When I add the line:
REGISTER_TENSORRT_PLUGIN(ResizeNearestPluginCreator);
my cross-compile output is:

/home/USER/nvidia/nvidia_sdk/DRIVE_Software_10.0_Linux_OS_DRIVE_AGX_PEGASUS_XT/DRIVEOS/toolchains/gcc-linaro-7.3.1-2018.05-x86_64_aarch64-linux-gnu/bin/../lib/gcc/aarch64-linux-gnu/7.3.1/../../../../aarch64-linux-gnu/bin/ld: CMakeFiles/sample_integrate_dnn.dir/main.cpp.o: undefined reference to symbol 'getPluginRegistry'
/usr/local/driveworks-2.2/targets/aarch64-Linux/lib/libnvinfer.so.5: error adding symbols: DSO missing from command line
collect2: error: ld returned 1 exit status
src/dnn/sample_integrate_dnn/CMakeFiles/sample_integrate_dnn.dir/build.make:370: recipe for target 'src/dnn/sample_integrate_dnn/sample_integrate_dnn' failed
make[2]: *** [src/dnn/sample_integrate_dnn/sample_integrate_dnn] Error 1
CMakeFiles/Makefile2:2825: recipe for target 'src/dnn/sample_integrate_dnn/CMakeFiles/sample_integrate_dnn.dir/all' failed
make[1]: *** [src/dnn/sample_integrate_dnn/CMakeFiles/sample_integrate_dnn.dir/all] Error 2
Makefile:129: recipe for target 'all' failed
make: *** [all] Error 2

What is the name of the library that provides the getPluginRegistry function? I found the documentation for NvInferRuntime.h, which declares getPluginRegistry (TensorRT: NvInferRuntime.h File Reference), but it appears not to be implemented in TensorRT 5.1.4 (Support for older TRT 5.1.4 · Issue #49 · NVIDIA-AI-IOT/CUDA-PointPillars · GitHub).
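The "DSO missing from command line" message from newer binutils usually means the library is only pulled in transitively and must be named explicitly on the link line. A sketch of the CMake change, assuming the DriveWorks cross-compile layout quoted in the error log (the target name and path come from this thread and are not verified):

```cmake
# Link libnvinfer explicitly so the linker can resolve getPluginRegistry.
# Path taken from the error log above; adjust to your install.
link_directories(/usr/local/driveworks-2.2/targets/aarch64-Linux/lib)
target_link_libraries(sample_integrate_dnn nvinfer)
```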

Or how can I implement a new plugin (ResizeNearest) in TensorRT 5.1.4 without REGISTER_TENSORRT_PLUGIN?
I tried to register the plugin with this example code:

// Look up the plugin in the registry
auto creator = getPluginRegistry()->getPluginCreator(pluginName, pluginVersion);
const PluginFieldCollection* pluginFC = creator->getFieldNames();
// Populate the fields parameters for the plugin layer
// PluginFieldCollection *pluginData = parseAndFillFields(pluginFC, layerFields);
// Create the plugin object using the layerName and the plugin meta data
IPluginV2 *pluginObj = creator->createPlugin(layerName, pluginData);
// Add the plugin to the TensorRT network
auto layer = network.addPluginV2(&inputs[0], int(inputs.size()), pluginObj);
… (build rest of the network and serialize engine)
// Destroy the plugin object
pluginObj->destroy();
… (free allocated pluginData)

(Developer Guide :: NVIDIA Deep Learning TensorRT Documentation)
but I don’t know where "network" and "inputs" come from or how to obtain them.
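For what it's worth, in that documentation snippet "network" is an INetworkDefinition created from a builder, and "inputs" are ITensor pointers taken from the network's inputs or from earlier layers' outputs. A sketch under TensorRT 5.x API assumptions (the logger, tensor name, and dimensions are made up for illustration):

```cpp
// Sketch only: assumes TensorRT 5.x headers and a gLogger implementing
// nvinfer1::ILogger; "data" and its dimensions are illustrative.
nvinfer1::IBuilder* builder = nvinfer1::createInferBuilder(gLogger);
nvinfer1::INetworkDefinition* network = builder->createNetwork();

// "inputs" are ITensor* handles: network inputs or outputs of earlier layers.
nvinfer1::ITensor* data = network->addInput(
    "data", nvinfer1::DataType::kFLOAT, nvinfer1::DimsCHW{3, 256, 256});
std::vector<nvinfer1::ITensor*> inputs{data};

// The plugin layer then consumes those tensors.
nvinfer1::IPluginV2Layer* layer = network->addPluginV2(
    inputs.data(), static_cast<int>(inputs.size()), *pluginObj);
```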

Dear @Matthew_Pimot,
May I know if you need to stick to DRIVE SW 10, or can you move to the latest release, i.e., DRIVE OS 5.2.6?
Please see Using bin-file after TensorRT optimization if it helps.

I’d rather stay on DRIVE SW 10, because driveworks-2.2 includes the DriveNet package and others, but if DRIVE OS 5.2.6 is the only solution, I can upgrade the DRIVE AGX to test it.

Dear @Matthew_Pimot,
If your application requires the DRIVE DNN modules (like DriveNet), you need to stay on the DRIVE SW 10.0 release. In that case, please check the sample_dnn_plugin sample code to learn how to integrate a layer as a plugin.

If your application is based on your own custom model, we would encourage you to use DRIVE OS 5.2.6 + DW 4.0, as it has new features compared to DRIVE SW 10.0 (like native support for the ResizeNearest layer).

Dear @SivaRamaKrishnaNV,
I checked the sample_dnn_plugin sample code, but in this sample I don’t see the plugin registration:

REGISTER_TENSORRT_PLUGIN(...);

or the plugin creation:

auto creator = getPluginRegistry()->getPluginCreator("ResizeNearest", "001");
const nvinfer1::PluginFieldCollection* pluginFC = creator->getFieldNames();
nvinfer1::IPluginV2 *pluginObj = creator->createPlugin("086_upsample", pluginFC);
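As quoted, the third line hands the creator's own field-name list (from getFieldNames()) straight back to createPlugin; in the documented flow a PluginFieldCollection is populated with actual values first. A sketch under TensorRT 5.x API assumptions (the field name "scale" and its value are illustrative only, not ResizeNearest's real parameters):

```cpp
// Sketch only: assumes TensorRT 5.x and that the creator expects a "scale"
// field; the real field names come from creator->getFieldNames().
auto creator = getPluginRegistry()->getPluginCreator("ResizeNearest", "001");

float scale = 2.0f;  // illustrative value
nvinfer1::PluginField fields[] = {
    {"scale", &scale, nvinfer1::PluginFieldType::kFLOAT32, 1}};
nvinfer1::PluginFieldCollection fc{1, fields};

nvinfer1::IPluginV2* pluginObj = creator->createPlugin("086_upsample", &fc);
```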

If I comment out pluginPath in main and project(dnn_fc_layer_plugin C CXX) in CMakeLists.txt, and use:

CHECK_DW_ERROR(dwDNN_initializeTensorRTFromFileNew(&m_dnn, tensorRTModel.c_str(), nullptr,
                                                               DW_PROCESSOR_TYPE_GPU, m_sdk));

I get the error: "Serialized engine contains plugin, but no plugin factory was provided."
If I try to use YOLO without the plugin, I get the error:

getPluginCreator could not find plugin ResizeNearest version 001 namespace

I want to add the plugin, but I don’t understand how.
Thanks for the help!

If I use another example (GitHub - NVIDIA/DL4AGX: Deep Learning tools and applications for NVIDIA AGX platforms), I see an error at the line REGISTER_TENSORRT_PLUGIN(…):
undefined reference to symbol ‘getPluginRegistry’
/usr/local/driveworks-2.2/targets/aarch64-Linux/lib/libnvinfer.so.5: error adding symbols: DSO missing from command line

Is it possible to use getPluginRegistry in TensorRT 5.1.4?