undefined reference to `createNvOnnxParserPluginFactory_INTERNAL'

I want to use the function createPluginFactory in my project on a Jetson Nano.

The TensorRT version on the Jetson Nano is 5.1.6.

Below is my CMakeLists.txt:

find_path(TENSORRT_INCLUDE_DIR NvInfer.h
        HINTS ${TENSORRT_ROOT} ${CUDA_TOOLKIT_ROOT_DIR}
        PATH_SUFFIXES include/)

message(STATUS "Found TensorRT headers at ${TENSORRT_INCLUDE_DIR}")

find_library(TENSORRT_LIBRARY_INFER nvinfer
        HINTS ${TENSORRT_ROOT} ${TENSORRT_BUILD} ${CUDA_TOOLKIT_ROOT_DIR}
        PATH_SUFFIXES lib lib64 lib/x64)
find_library(TENSORRT_LIBRARY_INFER_PLUGIN nvinfer_plugin
        HINTS ${TENSORRT_ROOT} ${TENSORRT_BUILD} ${CUDA_TOOLKIT_ROOT_DIR}
        PATH_SUFFIXES lib lib64 lib/x64)
find_library(TENSORRT_LIBRARY_PARSER nvparsers
        HINTS ${TENSORRT_ROOT} ${TENSORRT_BUILD} ${CUDA_TOOLKIT_ROOT_DIR}
        PATH_SUFFIXES lib lib64 lib/x64)

find_library(TENSORRT_LIBRARY_ONNXPARSER nvonnxparser
        HINTS ${TENSORRT_ROOT} ${TENSORRT_BUILD} ${CUDA_TOOLKIT_ROOT_DIR}
        PATH_SUFFIXES lib lib64 lib/x64)

set(TENSORRT_LIBRARY ${TENSORRT_LIBRARY_INFER} ${TENSORRT_LIBRARY_INFER_PLUGIN}
        ${TENSORRT_LIBRARY_PARSER} ${TENSORRT_LIBRARY_ONNXPARSER})
message(STATUS "Found TensorRT libs at ${TENSORRT_LIBRARY}")

include(FindPackageHandleStandardArgs)  # needed for find_package_handle_standard_args, if not already included
find_package_handle_standard_args(
        TENSORRT DEFAULT_MSG TENSORRT_INCLUDE_DIR TENSORRT_LIBRARY)

if(NOT TENSORRT_FOUND)
        message(FATAL_ERROR "Cannot find TensorRT library.")
endif()
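
For completeness, the variables above are consumed by the executable target roughly like this (a simplified sketch; the rest of my CMakeLists and the other libraries are omitted):

# Simplified sketch: only the TensorRT-related parts of the target setup are shown.
# The target and source names match the build output below.
include_directories(${TENSORRT_INCLUDE_DIR})
add_executable(webcam_demo example/webcam_demo.cpp src/cttrt_Net.cpp)
target_link_libraries(webcam_demo ${TENSORRT_LIBRARY})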

and the make error is below:

[ 5%] Building CXX object CMakeFiles/webcam_demo.dir/example/webcam_demo.cpp.o
[ 10%] Linking CXX executable webcam_demo
CMakeFiles/webcam_demo.dir/src/cttrt_Net.cpp.o: In function `nvonnxparser::(anonymous namespace)::createPluginFactory(nvinfer1::ILogger&)':
cttrt_Net.cpp:(.text+0x154): undefined reference to `createNvOnnxParserPluginFactory_INTERNAL'
collect2: error: ld returned 1 exit status
CMakeFiles/webcam_demo.dir/build.make:1041: recipe for target 'webcam_demo' failed
make[2]: *** [webcam_demo] Error 1
CMakeFiles/Makefile2:77: recipe for target 'CMakeFiles/webcam_demo.dir/all' failed
make[1]: *** [CMakeFiles/webcam_demo.dir/all] Error 2
Makefile:83: recipe for target 'all' failed
make: *** [all] Error 2


Any advice?

Moving this to the Jetson Nano forum so the Jetson team can take a look.

Meanwhile, please check if all the dependencies are installed correctly.

TensorRT came preinstalled with the Jetson Nano image.

The version is 5.1.6.

Hello 1054399357,

May I know which JetPack release you're working with?
You should also refer to a deep learning beginner tutorial; please check the jetson-inference GitHub.
Thanks

Hi Jerry Chang

The JetPack version is 4.2.

I have checked the link you provided; however, it doesn't help.

The problem seems to be that the "createNvOnnxParserPluginFactory_INTERNAL" function is not being linked properly; maybe the symbol lives in libnvonnxparser_runtime and that library isn't on the link line?

I am not so sure…

I solved the problem.

I just added the following:

find_library(TRT_ONNXPARSER_RUNTIME libnvonnxparser_runtime.so
        HINTS  /usr/lib/aarch64-linux-gnu/)
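
and made sure the found library actually gets linked into the target, roughly like this (the exact place depends on your CMakeLists; the variable and target names are the same as above):

# Append the ONNX parser runtime library to the TensorRT libraries
# and link them into the executable (sketch, using the names from above).
list(APPEND TENSORRT_LIBRARY ${TRT_ONNXPARSER_RUNTIME})
target_link_libraries(webcam_demo ${TENSORRT_LIBRARY})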

so that I can use the function properly.

Thanks!