No custom_parse_bbox_func in nvinfer

I am trying to implement a custom bounding-box parsing function (custom_parse_bbox_func) on an AGX Xavier with JetPack 4.4.

I can't find any custom_parse_bbox_func key in nvinfer.

So how do I hook a custom parsing function into object detection?

In my configuration file I have:
network-type=0
num-detected-classes=48
custom_parse_bbox_func=NvDsInferParseCustomCTCGreedy
custom_lib=/opt/nvidia/deepstream/deepstream-5.0/sources/libs/nvdsinfer_customparser/libnvds_infercustomparser.so
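(As far as I can tell from the gst-nvinfer documentation, the two custom_* keys above are not recognized at all; the documented pair looks like this instead, with the same function name and library path as my attempt above:)

```ini
# Documented gst-nvinfer keys (DeepStream 5.0), as I understand them
parse-bbox-func-name=NvDsInferParseCustomCTCGreedy
custom-lib-path=/opt/nvidia/deepstream/deepstream-5.0/sources/libs/nvdsinfer_customparser/libnvds_infercustomparser.so
```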

But the default ResNet parsing function is always called.

Then when I changed to

parse-bbox-func-name=NvDsInferParseCustomCTCGreedy
parse-bbox-lib-name=/opt/nvidia/deepstream/deepstream-5.0/lib/libnvds_infercustomparser.so

I get these errors:

0:00:03.346315954 15786   0x55b6e14490 ERROR                nvinfer gstnvinfer.cpp:613:gst_nvinfer_logger:<secondary1-nvinference-engine> NvDsInferContext[UID 2]: Error in NvDsInferContextImpl::initResource() <nvdsinfer_context_impl.cpp:681> [UID = 2]: Detect-postprocessor failed to init resource because dlsym failed to get func NvDsInferParseCTCGreedy pointer
ERROR: Infer Context failed to initialize post-processing resource, nvinfer error:NVDSINFER_CUSTOM_LIB_FAILED
ERROR: Infer Context prepare postprocessing resource failed., nvinfer error:NVDSINFER_CUSTOM_LIB_FAILED
0:00:03.349646996 15786   0x55b6e14490 WARN                 nvinfer gstnvinfer.cpp:809:gst_nvinfer_start:<secondary1-nvinference-engine> error: Failed to create NvDsInferContext instance
0:00:03.349746200 15786   0x55b6e14490 WARN                 nvinfer gstnvinfer.cpp:809:gst_nvinfer_start:<secondary1-nvinference-engine> error: Config file path: dstest2_sgie1_config.txt, NvDsInfer Error: NVDSINFER_CUSTOM_LIB_FAILED

I also tried a custom classifier function implementation:

network-type=1
num-detected-classes=48
parse-classifier-func-name=NvDsInferClassiferCTCGreedyParser

This also failed, with:

0:00:03.096188290 16232   0x5585a12c90 ERROR                nvinfer gstnvinfer.cpp:613:gst_nvinfer_logger:<secondary1-nvinference-engine> NvDsInferContext[UID 2]: Error in NvDsInferContextImpl::initResource() <nvdsinfer_context_impl.cpp:809> [UID = 2]: Failed to init classify-postprocessor because dlsym failed to get func NvDsInferClassiferCTCGreedyParser pointer
ERROR: Infer Context failed to initialize post-processing resource, nvinfer error:NVDSINFER_CUSTOM_LIB_FAILED
ERROR: Infer Context prepare postprocessing resource failed., nvinfer error:NVDSINFER_CUSTOM_LIB_FAILED

I put CHECK_CUSTOM_CLASSIFIER_PARSE_FUNC_PROTOTYPE(NvDsInferClassiferCTCGreedyParser); for the custom classifier in nvdsinfer_customclassifierparser.cpp, and CHECK_CUSTOM_PARSE_FUNC_PROTOTYPE(NvDsInferParseCustomCTCGreedy); for the detector in nvdsinfer_custombboxparser.cpp.
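For context, here is roughly what my detector parser declaration looks like; a sketch based on nvdsinfer_custom_impl.h from the DeepStream 5.0 SDK (the body is elided). The extern "C" linkage is what keeps the symbol name unmangled so dlsym can find it; without it, a C++ compiler mangles the name and the lookup fails the way the logs above show.

```cpp
/* Sketch of the custom bbox parser entry point, assuming the types and
 * macro from nvdsinfer_custom_impl.h (DeepStream 5.0 SDK). */
#include "nvdsinfer_custom_impl.h"

extern "C" bool NvDsInferParseCustomCTCGreedy(
    std::vector<NvDsInferLayerInfo> const &outputLayersInfo,
    NvDsInferNetworkInfo const &networkInfo,
    NvDsInferParseDetectionParams const &detectionParams,
    std::vector<NvDsInferObjectDetectionInfo> &objectList)
{
    /* ... decode outputLayersInfo and fill objectList ... */
    return true;
}

/* Compile-time check that the signature matches what nvinfer expects. */
CHECK_CUSTOM_PARSE_FUNC_PROTOTYPE(NvDsInferParseCustomCTCGreedy);
```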

Any suggestions? I need to finish this custom output implementation this week.

I tested all of the provided custom classifier and detector parsers; none of them can be loaded.

Does DeepStream have an issue here, or am I missing something in my implementation?

I found out that deepstream-app can load

parse-bbox-func-name=NvDsInferParseCustomResnet
custom-lib-path=libnvds_infercustomparser.so

But I can't load it in my own application.
The deepstream-app source is hard to follow.
Can somebody explain how deepstream-app manages to load the custom function?

Now I see what is happening. I have

custom-lib-path=/usr/src/tensorrt/CTCGreedyDecoder_Plugin/build/libCTCGreedyDecoder.so

set for my custom layer plugin, so

parse-bbox-func-name=NvDsInferParseCustomCTCGreedy

also looks for the parsing function in that same library, and can't find it.

How can I set the library paths if I have both a custom layer plugin (using IPluginV2DynamicExt) and a custom parser function for the network output?

Solved: I put the custom parser inside the custom layer plugin project and built a single library containing both.
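For anyone hitting the same thing, the resulting configuration looks roughly like this (paths are from my setup):

```ini
# One library now provides both the IPluginV2DynamicExt custom layer
# and the output parser, so a single custom-lib-path covers both.
parse-bbox-func-name=NvDsInferParseCustomCTCGreedy
custom-lib-path=/usr/src/tensorrt/CTCGreedyDecoder_Plugin/build/libCTCGreedyDecoder.so
```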