• Hardware Platform (GPU)
• DeepStream Version 5.0
• TensorRT Version 7.2.1-1
• NVIDIA GPU Driver Version (valid for GPU only) 450.102.04
• Issue Type: Question
• I am facing a problem extracting the tensor output of a custom ONNX network.
I want to run pgie inference to get a bounding box for an object, then run sgie inference within that bbox and read the raw inference result. I can get the NvDsInferTensorMeta from obj_meta->obj_user_meta_list, and I can print the metadata's unique ID and output-layer info like this:
NvDsInferTensorMeta *meta = (NvDsInferTensorMeta *)user_meta->user_meta_data;
std::cout << "meta->unique_id=" << meta->unique_id << std::endl;
std::cout << "meta->num_output_layers=" << meta->num_output_layers << std::endl;
std::cout << "meta->output_layers_info[0].inferDims.numElements: " << meta->output_layers_info[0].inferDims.numElements << std::endl;
std::cout << "meta->output_layers_info[0].layerName: " << meta->output_layers_info[0].layerName << std::endl;
But I cannot access the actual tensor data with this:
float *output_data = (float *)meta->output_layers_info[0].buffer;
I need this data to do my own output parsing. Is there another way to access the inference tensor data?
TIA