Get class_probabilities_map from segmentation task

Hello everyone,

I have a custom model that generates a heatmap. The goal is to get the raw output tensor and visualize it using the Python bindings, but I'm currently stuck on how to get the raw output tensor.

INFO: [Implicit Engine Info]: layers num: 2
0   INPUT  kFLOAT input           3x512x512       <= This is input tensor shape
1   OUTPUT kFLOAT output          1x64x64       <= This is output tensor shape

Since this model is similar to a segmentation model, I'm trying to get the raw output from class_probabilities_map in NVDSINFER_SEGMENTATION_META.

My first step is to get frame_user_meta_list and cast its entries to NvDsUserMeta, like below:

l_user = frame_meta.frame_user_meta_list
user_meta = pyds.NvDsUserMeta.cast(l_user.data)

After that, I skip any user_meta that does not belong to NVDSINFER_SEGMENTATION_META:

    if (
            user_meta.base_meta.meta_type
            != pyds.NvDsMetaType.NVDSINFER_SEGMENTATION_META
    ):
        continue
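
Put together, the probe I have so far looks roughly like the sketch below. It follows the usual metadata-iteration pattern from the deepstream_python_apps samples; the probe function name and the l_user.next stepping are just how I wired it up on my side:

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst
    import pyds

    def pgie_src_pad_buffer_probe(pad, info, u_data):
        gst_buffer = info.get_buffer()
        if not gst_buffer:
            return Gst.PadProbeReturn.OK

        # Standard deepstream_python_apps pattern: batch meta -> frame meta
        # -> frame-level user meta.
        batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
        l_frame = batch_meta.frame_meta_list
        while l_frame is not None:
            frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
            l_user = frame_meta.frame_user_meta_list
            while l_user is not None:
                user_meta = pyds.NvDsUserMeta.cast(l_user.data)
                if (user_meta.base_meta.meta_type
                        != pyds.NvDsMetaType.NVDSINFER_SEGMENTATION_META):
                    l_user = l_user.next
                    continue
                # ... this is where I want to read class_probabilities_map ...
                l_user = l_user.next
            l_frame = l_frame.next
        return Gst.PadProbeReturn.OK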

But unfortunately, I'm not able to get the class_probabilities_map from either user_meta.class_probabilities_map or user_meta.base_meta.class_probabilities_map.
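
For reference, based on the C struct NvDsInferSegmentationMeta, I expected something roughly like the sketch below to work. Whether these attributes are exposed at all in the DS 5.0 Python bindings is exactly what I'm unsure about:

    # What I expected, based on the C API struct NvDsInferSegmentationMeta.
    # Whether NvDsInferSegmentationMeta and class_probabilities_map are
    # exposed in the DS 5.0 Python bindings is an assumption on my part.
    seg_meta = pyds.NvDsInferSegmentationMeta.cast(user_meta.user_meta_data)
    print(seg_meta.classes, seg_meta.width, seg_meta.height)
    probs = seg_meta.class_probabilities_map  # the raw per-class probabilities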

Could someone tell me how to get the class_probabilities_map?

Any advice is appreciated :)

• Hardware Platform (Jetson)
• DeepStream Version (V 5.0)
• JetPack Version (V4.4)
• TensorRT Version (V 7.1.3)
• Issue Type (questions)

Hi @hwang2uhzs,
Sorry for the delay; the team was on holiday for the past 8 days!

Maybe you can refer to this sample: https://github.com/NVIDIA-AI-IOT/deepstream_python_apps/tree/master/apps/deepstream-ssd-parser

  1. Register a pgie srcpad probe function
    pgiesrcpad.add_probe(Gst.PadProbeType.BUFFER, pgie_src_pad_buffer_probe, 0)
  2. In the src pad probe function, call nvds_infer_parse_custom_tf_ssd(layers_info, detection_params, box_size_param, nms_param). Here layers_info holds NvDsInferLayerInfo entries, whose buffer property is a pointer to that layer's data, so the data pointed to by buffer is the raw output of the network (see the sketch below).
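
Concretely, inside the user-meta loop of your probe, the raw-tensor access would look roughly like the sketch below. It is adapted from the deepstream-ssd-parser sample and assumes you set output-tensor-meta=1 in the pgie (nvinfer) config so that NVDSINFER_TENSOR_OUTPUT_META is attached; the 64*64 element count just reflects your 1x64x64 output.

    # Rough sketch adapted from the deepstream-ssd-parser sample.
    # Assumes output-tensor-meta=1 in the nvinfer (pgie) config file so that
    # NVDSINFER_TENSOR_OUTPUT_META user meta is attached to each frame.
    if (user_meta.base_meta.meta_type
            == pyds.NvDsMetaType.NVDSINFER_TENSOR_OUTPUT_META):
        tensor_meta = pyds.NvDsInferTensorMeta.cast(user_meta.user_meta_data)
        for i in range(tensor_meta.num_output_layers):
            layer = pyds.get_nvds_LayerInfo(tensor_meta, i)
            # layer.buffer points to this layer's raw output data; read the
            # floats element by element, as the ssd-parser custom parser does.
            heatmap = [pyds.get_detections(layer.buffer, idx)
                       for idx in range(64 * 64)]  # your 1x64x64 output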

Thank you very much for the reply. @mchi

Really appreciate it; I'll give it a try and may ask you some questions later!