BodyPoseNet from NVIDIA NGC deployed in DeepStream 6.0 does not return inference output

We have an application built in Python on top of DeepStream 6.0 for object detection and tracking. We successfully integrated BodyPoseNet, downloaded from NVIDIA NGC with:

wget --content-disposition https://api.ngc.nvidia.com/v2/models/nvidia/tao/bodyposenet/versions/deployable_v1.0.1/zip -O bodyposenet_deployable_v1.0.1.zip

Please find the config file attached.

While running the DeepStream pipeline and inspecting the NvDsFrameMeta object, bInferDone equals 0, obj_meta_list is None, and the other inference-related fields are empty or None.

Why would this happen? Please go through the thread and suggest possible solutions.

bodypose2d_pgie_config.conf (3.0 KB)

I noticed network-type=100 in the config. With that setting you need to parse the model output yourself and set bInferDone to 1 after attaching the metadata; please refer to attach_metadata_detector.
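To illustrate the "parse the output yourself" step, here is a minimal sketch of the pure post-processing part. It assumes an OpenPose-style model that emits per-keypoint confidence heatmaps in a 1 x K x H x W tensor; the exact tensor layout, channel order, and threshold of the BodyPoseNet deployable are assumptions, not taken from the model's spec. In the real pipeline this logic would run inside a pad probe on raw tensor meta before you attach metadata and set bInferDone.

```python
# Hypothetical heatmap parser (assumed 1 x K x H x W layout, NOT the
# verified BodyPoseNet output spec): find the peak of each keypoint
# heatmap and scale it back to frame coordinates.
import numpy as np

def parse_heatmaps(heatmaps, frame_w, frame_h, threshold=0.1):
    """Return one (x, y, score) per keypoint channel, or None when the
    channel's peak confidence falls below the threshold."""
    _, num_parts, hm_h, hm_w = heatmaps.shape
    keypoints = []
    for k in range(num_parts):
        hm = heatmaps[0, k]
        idx = np.argmax(hm)                      # flat index of strongest response
        y, x = np.unravel_index(idx, hm.shape)   # back to 2-D heatmap coords
        score = float(hm[y, x])
        if score < threshold:
            keypoints.append(None)               # no confident detection for this part
            continue
        # scale heatmap-grid coordinates up to the frame resolution
        keypoints.append((x * frame_w / hm_w, y * frame_h / hm_h, score))
    return keypoints
```

The same shape of logic appears in C in the samples referenced later in this thread; only the peak-finding and coordinate scaling differ per model.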

Where should I look to get the model output and parse it?

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

Please refer to the DeepStream SDK sample deepstream-infer-tensor-meta-test; it also uses network-type=100 in its configuration file, and the parsing is done in the function pgie_pad_buffer_probe.
Here is a bodypose2d sample: deepstream_tao_apps/apps/tao_others/deepstream-bodypose2d-app at master · NVIDIA-AI-IOT/deepstream_tao_apps · GitHub
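For the probe to receive anything to parse, the raw output tensors must be attached to the frame meta. A sketch of the relevant nvinfer settings (based on this thread, not the full attached config):

```
[property]
# network-type=100 marks the model as "other", so nvinfer performs no
# built-in parsing and attaches no object metadata itself
network-type=100
# output-tensor-meta=1 attaches the raw output tensors to the batch meta
# so a pad probe (e.g. pgie_pad_buffer_probe) can read and parse them
output-tensor-meta=1
```

With both set, the probe reads NvDsInferTensorMeta from the frame's user meta, runs the model-specific parsing, attaches the resulting metadata, and sets bInferDone to 1.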

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.