How is metadata passed to the tracker or secondary inference engine after primary inference?

Hello, I’m working with DeepStream.

I understand that after secondary inference the metadata can be stored and transmitted in various ways, but I’m curious how the metadata produced by primary inference is delivered to the tracker or the secondary inference engine.

Looking at the config files, the secondary inference is given the unique ID of the primary inference, but I’d like to understand in detail how it actually retrieves the metadata.
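For reference, this is roughly the linkage I mean; the file names are just examples, but gie-unique-id and operate-on-gie-id are the nvinfer properties I’m referring to:

```
# pgie_config.txt (illustrative)
[property]
gie-unique-id=1          # ID the primary nvinfer stamps on the metadata it produces

# sgie_config.txt (illustrative)
[property]
gie-unique-id=2
operate-on-gie-id=1      # run only on objects produced by GIE with unique ID 1
```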

Thank you.

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs; include which sample app is used, the configuration file contents, the command line used, and other details needed to reproduce it.)
• Requirement details (This is for new requirements; include the module name, i.e. which plugin or which sample application, and a description of the function.)

You can refer to gstnvinfer.cpp → gst_nvinfer_process_objects().
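Roughly speaking, there is no separate channel for the metadata: the primary nvinfer attaches NvDsBatchMeta (containing NvDsFrameMeta and NvDsObjectMeta) to the GstBuffer, and that buffer flows downstream, so the tracker and the secondary nvinfer read the same batch meta from the buffer they receive. In gst_nvinfer_process_objects() the secondary instance walks the object list and compares each object's unique_component_id with its operate-on-gie-id setting. Below is a minimal sketch of reading that metadata from a pad probe (the probe, element placement, and printed fields are illustrative, not code from gstnvinfer.cpp):

```c
/* Sketch of a pad probe (e.g. attached to the tracker or SGIE sink pad)
 * that reads the object metadata the PGIE attached to the GstBuffer. */
#include <gst/gst.h>
#include "gstnvdsmeta.h"

static GstPadProbeReturn
sink_pad_probe (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);

  /* The batch meta travels with the buffer itself; downstream elements
   * (tracker, SGIE) retrieve it the same way. */
  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);
  if (!batch_meta)
    return GST_PAD_PROBE_OK;

  for (NvDsMetaList *l_frame = batch_meta->frame_meta_list; l_frame;
       l_frame = l_frame->next) {
    NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) l_frame->data;

    for (NvDsMetaList *l_obj = frame_meta->obj_meta_list; l_obj;
         l_obj = l_obj->next) {
      NvDsObjectMeta *obj_meta = (NvDsObjectMeta *) l_obj->data;

      /* unique_component_id records which GIE produced this object;
       * the SGIE compares it against its operate-on-gie-id setting. */
      g_print ("source gie=%d class=%d bbox=(%.0f,%.0f,%.0f,%.0f)\n",
               obj_meta->unique_component_id, obj_meta->class_id,
               obj_meta->rect_params.left, obj_meta->rect_params.top,
               obj_meta->rect_params.width, obj_meta->rect_params.height);
    }
  }
  return GST_PAD_PROBE_OK;
}
```

The tracker reads and updates the same NvDsObjectMeta in place (filling in object_id), so by the time the buffer reaches the secondary nvinfer the objects it operates on already carry tracking IDs.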

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.