Please provide the following info (check/uncheck the boxes after creating this topic):
Software Version
DRIVE OS Linux 5.2.6
DRIVE OS Linux 5.2.6 and DriveWorks 4.0
DRIVE OS Linux 5.2.0
DRIVE OS Linux 5.2.0 and DriveWorks 3.5
NVIDIA DRIVE™ Software 10.0 (Linux)
NVIDIA DRIVE™ Software 9.0 (Linux)
other DRIVE OS version
other
Target Operating System
Linux
QNX
other
Hardware Platform
NVIDIA DRIVE™ AGX Xavier DevKit (E3550)
NVIDIA DRIVE™ AGX Pegasus DevKit (E3550)
other
SDK Manager Version
1.9.10816
other
Host Machine Version
native Ubuntu 18.04
other
Hi,
I have a problem building my own plugin (ResizeNearest) for TensorRT (TensorRT 5.1.4): "undefined reference to symbol 'getPluginRegistry'".
When I add the line:
REGISTER_TENSORRT_PLUGIN(ResizeNearestPluginCreator);
my cross-compile output is:
/home/USER/nvidia/nvidia_sdk/DRIVE_Software_10.0_Linux_OS_DRIVE_AGX_PEGASUS_XT/DRIVEOS/toolchains/gcc-linaro-7.3.1-2018.05-x86_64_aarch64-linux-gnu/bin/../lib/gcc/aarch64-linux-gnu/7.3.1/../../../../aarch64-linux-gnu/bin/ld: CMakeFiles/sample_integrate_dnn.dir/main.cpp.o: undefined reference to symbol 'getPluginRegistry'
/usr/local/driveworks-2.2/targets/aarch64-Linux/lib/libnvinfer.so.5: error adding symbols: DSO missing from command line
collect2: error: ld returned 1 exit status
src/dnn/sample_integrate_dnn/CMakeFiles/sample_integrate_dnn.dir/build.make:370: recipe for target 'src/dnn/sample_integrate_dnn/sample_integrate_dnn' failed
make[2]: *** [src/dnn/sample_integrate_dnn/sample_integrate_dnn] Error 1
CMakeFiles/Makefile2:2825: recipe for target 'src/dnn/sample_integrate_dnn/CMakeFiles/sample_integrate_dnn.dir/all' failed
make[1]: *** [src/dnn/sample_integrate_dnn/CMakeFiles/sample_integrate_dnn.dir/all] Error 2
Makefile:129: recipe for target 'all' failed
make: *** [all] Error 2
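As far as I understand, the "DSO missing from command line" message from ld means main.cpp.o uses a symbol from libnvinfer.so, but that library is only pulled in indirectly and must be added to the link line explicitly. A sketch of the fix, assuming the sample's CMake target is named sample_integrate_dnn and the TensorRT libraries are already on the linker search path:

```cmake
# Hypothetical addition to the sample's CMakeLists.txt:
# link libnvinfer directly so ld can resolve getPluginRegistry.
target_link_libraries(sample_integrate_dnn nvinfer)
```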
What is the name of the library that provides the getPluginRegistry symbol? I found the docs for NvInferRuntime.h, which declares getPluginRegistry (TensorRT: NvInferRuntime.h File Reference), but it is not implemented in TensorRT 5.1.4 (Support for older TRT 5.1.4 · Issue #49 · NVIDIA-AI-IOT/CUDA-PointPillars · GitHub).
Or, how can I implement a new ResizeNearest plugin in TensorRT 5.1.4 without REGISTER_TENSORRT_PLUGIN?
I tried to register the plugin with the example code from the docs:
// Look up the plugin in the registry
auto creator = getPluginRegistry()->getPluginCreator(pluginName, pluginVersion);
const PluginFieldCollection* pluginFC = creator->getFieldNames();
// Populate the fields parameters for the plugin layer
// (parseAndFillFields is a placeholder helper from the docs)
PluginFieldCollection* pluginData = parseAndFillFields(pluginFC, layerFields);
// Create the plugin object using the layerName and the plugin meta data
IPluginV2* pluginObj = creator->createPlugin(layerName, pluginData);
// Add the plugin to the TensorRT network
auto layer = network.addPluginV2(&inputs[0], int(inputs.size()), *pluginObj);
… (build rest of the network and serialize engine)
// Destroy the plugin object
pluginObj->destroy();
… (free allocated pluginData)
(Developer Guide :: NVIDIA Deep Learning TensorRT Documentation)
but I don’t know where "network" and "inputs" come from, or how to create them.
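My understanding is that "network" in the snippet is an INetworkDefinition obtained from the TensorRT builder, and "inputs" are ITensor pointers (the network input or outputs of earlier layers). A minimal sketch of that surrounding context, assuming the TensorRT 5.x C++ API (the shape 3x224x224 and the tensor name "data" are just example values):

```cpp
#include <cstdio>
#include <vector>
#include "NvInfer.h"

using namespace nvinfer1;

// Minimal logger required by the builder.
class Logger : public ILogger
{
    void log(Severity severity, const char* msg) override
    {
        // Print warnings and errors only.
        if (severity <= Severity::kWARNING)
            printf("%s\n", msg);
    }
} gLogger;

void buildWithPlugin()
{
    // "network" comes from the builder: createNetwork() in TensorRT 5,
    // createNetworkV2() in later versions.
    IBuilder* builder = createInferBuilder(gLogger);
    INetworkDefinition* network = builder->createNetwork();

    // "inputs" are ITensor*s; here a single network input as an example.
    ITensor* data = network->addInput("data", DataType::kFLOAT, Dims3{3, 224, 224});
    std::vector<ITensor*> inputs{data};

    // Plugin lookup and creation would go here, as in the quoted snippet
    // (pluginName, pluginVersion, layerName, pluginData as defined there),
    // followed by:
    // network->addPluginV2(inputs.data(), int(inputs.size()), *pluginObj);

    network->destroy();
    builder->destroy();
}
```

This is only a sketch of where those objects come from, not a complete sample.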