Need custom resnet18 parser code

• Hardware Platform (Jetson / GPU): T4
• DeepStream Version: 6.1
• JetPack Version (valid for Jetson only)
• TensorRT Version: 11.4
• NVIDIA GPU Driver Version (valid for GPU only)

I'm using deepstream_python_apps/apps/deepstream-ssd-parser at master · NVIDIA-AI-IOT/deepstream_python_apps · GitHub as a reference.

You can refer to the custom parser sample under sources/libs/nvdsinfer_customparser/ for resnet18 output layer parsing.
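
For illustration, a bbox parse function built along the lines of that sample locates the coverage and bbox output layers by name and fills objectList with decoded boxes. Below is a minimal sketch for a DetectNet_v2-style resnet18 detector; the layer names ("output_cov/Sigmoid", "output_bbox/BiasAdd"), the assumed tensor layout, the grid stride and the box decode are assumptions that have to be adapted to your exported model:

/* Minimal sketch of a custom bbox parser for a DetectNet_v2-style resnet18,
 * modeled on sources/libs/nvdsinfer_customparser/. Layer names, tensor
 * layout and decode math below are assumptions -- adjust to your model. */
#include <cstring>
#include <vector>
#include "nvdsinfer_custom_impl.h"

extern "C" bool NvDsInferParseCustomResnet18(
    std::vector<NvDsInferLayerInfo> const &outputLayersInfo,
    NvDsInferNetworkInfo const &networkInfo,
    NvDsInferParseDetectionParams const &detectionParams,
    std::vector<NvDsInferObjectDetectionInfo> &objectList)
{
    const NvDsInferLayerInfo *covLayer = nullptr;
    const NvDsInferLayerInfo *bboxLayer = nullptr;

    /* Look the output layers up by name; if either name does not match the
     * model's actual outputs, the parser cannot find the buffer and fails. */
    for (const auto &layer : outputLayersInfo) {
        if (!strcmp(layer.layerName, "output_cov/Sigmoid"))
            covLayer = &layer;
        else if (!strcmp(layer.layerName, "output_bbox/BiasAdd"))
            bboxLayer = &layer;
    }
    if (!covLayer || !bboxLayer || !covLayer->buffer || !bboxLayer->buffer)
        return false;

    /* Assumed layout: cov  = [numClasses, gridH, gridW]
     *                 bbox = [numClasses * 4, gridH, gridW] */
    unsigned int numClasses = covLayer->inferDims.d[0];
    if (numClasses > detectionParams.numClassesConfigured)
        numClasses = detectionParams.numClassesConfigured;
    const unsigned int gridH = covLayer->inferDims.d[1];
    const unsigned int gridW = covLayer->inferDims.d[2];
    const unsigned int plane = gridH * gridW;
    const unsigned int stride = networkInfo.width / gridW;  /* assumption */
    const float *cov = static_cast<const float *>(covLayer->buffer);
    const float *bbox = static_cast<const float *>(bboxLayer->buffer);

    for (unsigned int c = 0; c < numClasses; c++) {
        const float threshold = detectionParams.perClassPreclusterThreshold[c];
        for (unsigned int y = 0; y < gridH; y++) {
            for (unsigned int x = 0; x < gridW; x++) {
                const unsigned int idx = c * plane + y * gridW + x;
                if (cov[idx] < threshold)
                    continue;
                /* Assumed decode: offsets are pixel distances from the
                 * grid-cell centre. */
                const float cx = (x + 0.5f) * stride;
                const float cy = (y + 0.5f) * stride;
                const float x1 = cx - bbox[(c * 4 + 0) * plane + y * gridW + x];
                const float y1 = cy - bbox[(c * 4 + 1) * plane + y * gridW + x];
                const float x2 = cx + bbox[(c * 4 + 2) * plane + y * gridW + x];
                const float y2 = cy + bbox[(c * 4 + 3) * plane + y * gridW + x];

                NvDsInferObjectDetectionInfo obj{};
                obj.classId = c;
                obj.detectionConfidence = cov[idx];
                obj.left = x1;
                obj.top = y1;
                obj.width = x2 - x1;
                obj.height = y2 - y1;
                objectList.push_back(obj);
            }
        }
    }
    return true;
}
CHECK_CUSTOM_PARSE_FUNC_PROTOTYPE(NvDsInferParseCustomResnet18);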

After building the parser, I'm getting the below error during inference:


Warning: gst-library-error-quark: Configuration file batch-size reset to: 1 (5): gstnvinferserver_impl.cpp(287): validatePluginConfig (): /GstPipeline:pipeline0/GstNvInferServer:primary-inference
Could not find bbox layer buffer while parsing
ERROR: infer_postprocess.cpp:1043 Failed to parse bboxes using custom parse function
ERROR: infer_postprocess.cpp:378 detection parsing output tensor data failed, uid:5, nvinfer error:NVDSINFER_CUSTOM_LIB_FAILED
ERROR: infer_postprocess.cpp:265 Infer context initialize inference info failed, nvinfer error:NVDSINFER_CUSTOM_LIB_FAILED

Did you change these properties in the infer configuration file?
parse-classifier-func-name=NvDsInferClassiferParseCustomSoftmax
custom-lib-path=/path/to/this/directory/libnvds_infercustomparser.so
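
Note that the log above comes from gst-nvinferserver (the Triton plugin), so the custom parser is hooked in through the prototxt config rather than the nvinfer key-value file. A rough sketch of the relevant section, assuming a detector postprocess; the exact field names and values should be verified against the nvinferserver config proto and the samples shipped with DeepStream 6.1:

infer_config {
  postprocess {
    labelfile_path: "labels.txt"
    detection {
      num_detected_classes: 3
      custom_parse_bbox_func: "NvDsInferParseCustomResnet18"
      simple_cluster {
        threshold: 0.3
      }
    }
  }
  custom_lib {
    path: "/path/to/libnvds_infercustomparser.so"
  }
}

The name given to custom_parse_bbox_func must match the symbol exported by the library, and the layer names the parser searches for must match the model's actual output layer names; a mismatch in the latter is a common cause of the "Could not find bbox layer buffer" message above.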

There has been no update from you for a while, so we assume this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.
Thanks

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.