Custom parser for YOLOv8 in DeepStream 6.2 using Python programming

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) Jetson
• DeepStream Version: 6.2
• JetPack Version (valid for Jetson only) 5.2
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs)
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

Hello All,

I have a YOLOv8 model, I am using DeepStream 6.2 with Python, and I want to build a custom parser for YOLOv8 using Python programming.

Can you point me to documentation for writing a custom parser in Python, or give me the steps here?

Appreciate your feedback

Could you look at DeepStream-Yolo-Face first to see if it meets your needs?

Inside config_infer_primary_yoloV8_face.txt:

parse-bbox-instance-mask-func-name=NvDsInferParseYoloFace
custom-lib-path=nvdsinfer_custom_impl_Yolo_face/libnvdsinfer_custom_impl_Yolo_face.so
output-instance-mask=1

Can I replace the libnvdsinfer_custom_impl_Yolo_face.so C++ custom parser with a
libnvdsinfer_custom_impl_Yolo_face.py Python custom parser?

So, can we implement the custom parser in the Python programming language?

Appreciate your feedback

No, a .py file cannot replace the .so custom parser library.

Yes, you can do the parsing in Python, but it is more complicated. You need to get the output tensor data and parse it yourself. You can refer to the deepstream-ssd-parser sample to learn how to get the output tensor and parse it.
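As a rough sketch of that approach (not a drop-in replacement for the .so parser): if the nvinfer config sets output-tensor-meta=1 so the raw YOLOv8 output tensor is attached to the frame metadata, a pad probe can read it with pyds, roughly the way the deepstream-ssd-parser sample does. The probe name, OUTPUT_SHAPE and the decode step below are placeholders you have to adapt to your own model export.

import ctypes

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

import numpy as np
import pyds

# Assumed YOLOv8 output layout (e.g. 84 x 8400); adjust to your exported model.
OUTPUT_SHAPE = (84, 8400)


def pgie_src_pad_buffer_probe(pad, info, u_data):
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK

    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        l_user = frame_meta.frame_user_meta_list
        while l_user is not None:
            user_meta = pyds.NvDsUserMeta.cast(l_user.data)
            if user_meta.base_meta.meta_type == pyds.NvDsMetaType.NVDSINFER_TENSOR_OUTPUT_META:
                tensor_meta = pyds.NvDsInferTensorMeta.cast(user_meta.user_meta_data)
                for i in range(tensor_meta.num_output_layers):
                    layer = pyds.get_nvds_LayerInfo(tensor_meta, i)
                    # View the raw FP32 output buffer as a numpy array.
                    ptr = ctypes.cast(pyds.get_ptr(layer.buffer),
                                      ctypes.POINTER(ctypes.c_float))
                    output = np.ctypeslib.as_array(ptr, shape=OUTPUT_SHAPE)
                    # Decode boxes/scores from `output` according to your YOLOv8
                    # export, run NMS, then attach the results to frame_meta the
                    # way the deepstream-ssd-parser sample does (object meta pool).
            try:
                l_user = l_user.next
            except StopIteration:
                break
        try:
            l_frame = l_frame.next
        except StopIteration:
            break
    return Gst.PadProbeReturn.OK

The probe is attached to the nvinfer element's src pad, for example pgie.get_static_pad("src").add_probe(Gst.PadProbeType.BUFFER, pgie_src_pad_buffer_probe, 0), where pgie is your nvinfer element.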

Thanks for your feedback

The example you shared uses a custom parser in a probe function, but I need the custom parser to process the output detections directly after YOLOv8. Can you give me a direct example that replaces libnvdsinfer_custom_impl_Yolo_face.so with a custom libnvdsinfer_custom_impl_Yolo_face.py?

No. You cannot replace libnvdsinfer_custom_impl_Yolo_face.so with a custom libnvdsinfer_custom_impl_Yolo_face.py. The library is loaded by sources\libs\nvdsinfer\nvdsinfer_context_impl.cpp. You can refer to our source code:

    if (!string_empty(initParams.customLibPath))
    {
        /* custom-lib-path is passed to dlopen(), so it must be a compiled
         * shared object (.so); a .py file cannot be loaded here. */
        std::unique_ptr<DlLibHandle> dlHandle =
            std::make_unique<DlLibHandle>(initParams.customLibPath, RTLD_LAZY);
        if (!dlHandle->isValid())
        {
            printError("Could not open custom lib: %s", dlerror());
            return NVDSINFER_CUSTOM_LIB_FAILED;
        }
        m_CustomLibHandle = std::move(dlHandle);
    }
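If you go the Python route described above instead, the relevant gst-nvinfer settings look roughly like the deepstream-ssd-parser sample config (key names from that sample; exact values depend on your model):

# Skip nvinfer's built-in post-processing and attach the raw output tensors
# to the metadata, so they can be parsed in a Python pad probe instead.
network-type=100
output-tensor-meta=1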

There has been no update from you for a while, so we are assuming this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.