Secondary inference using IInferCustomProcessor: How to access ObjectMeta?

Please provide complete information as applicable to your setup.

• **Hardware Platform (Jetson / GPU)**: dGPU
• **DeepStream Version**: 6.0.1
• **JetPack Version (valid for Jetson only)**: None
• **TensorRT Version**: 8.0.1
• **NVIDIA GPU Driver Version (valid for GPU only)**: 495.29.05
• **Issue Type (questions, new requirements, bugs)**: questions
• **How to reproduce the issue?** (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing): None
• **Requirement details** (This is for new requirements. Include the module name — for which plugin or sample application — and the function description): None

I want to implement secondary inference and metadata attaching using IInferCustomProcessor (nvinferserver),
but I cannot access ObjectMeta in inferenceDone().

I checked the nvdsinferserver_custom_process.cpp sample but couldn't figure out how to do it.

I found OPTION_NVDS_OBJ_META_LIST in the header, but
inOptions->hasValue(OPTION_NVDS_OBJ_META_LIST) returns false in my experiments.

It seems that inOptions->getValueArray(OPTION_NVDS_FRAME_META_LIST, frameMetaList) returns ObjectMeta* pointers instead of FrameMeta* pointers in the case of secondary inference.

With this, I can access ObjectMeta, so my problem is solved for now.

However, this behavior is unexpected.
It would be nice if it were either documented or corrected.

Glad to know the issue is fixed. We will forward the comments to the internal team to improve it. Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.