How to output inference information in deepstream-app

Please provide complete information as applicable to your setup.

**• Hardware Platform (Jetson / GPU)** Jetson Nano
**• DeepStream Version** 5.1
**• JetPack Version (valid for Jetson only)** 4.6.1
**• TensorRT Version** 7.1.3
**• Issue Type (questions, new requirements, bugs)** New requirement
**• Requirement details** (This is for a new requirement. Include the module name — the plugin or sample application concerned — and the function description.)
The directory used is /opt/nvidia/deepstream/deepstream-5.1/sources/apps/sample_apps/deepstream-app

I have an image classification model, and I use the configuration file to achieve the current functionality.

I want to get the inference information for each image, along with the image's source ID. How can I modify deepstream_app.c?

I have set output-tensor-meta to true in the config file.
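For reference, the relevant setting in the nvinfer config group typically looks like this (a sketch; the group name `[primary-gie]` and the other keys are assumptions that depend on your own config file):

```ini
[primary-gie]
enable=1
# Attach the raw inference output tensors to the frame metadata
# as NvDsInferTensorMeta (user meta of type NVDSINFER_TENSOR_OUTPUT_META)
output-tensor-meta=1
```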

The source ID can be retrieved from NvDsFrameMeta->source_id. If you only want to access the metadata, a more convenient way is to run/adapt deepstream-test1 (C/C++ Sample Apps Source Details — DeepStream 6.1.1 Release documentation).
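A minimal sketch of how this could look as a pad probe added in deepstream_app.c, assuming the probe is attached to a pad downstream of the primary GIE. The callback name is hypothetical, and the model-specific parsing of the output layers is left as a comment:

```c
#include <gst/gst.h>
#include "gstnvdsmeta.h"
#include "nvdsinfer.h"

/* Sketch of a pad probe: print each frame's source_id and check for
 * attached inference tensor meta (requires output-tensor-meta=1). */
static GstPadProbeReturn
infer_probe_cb (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);
  if (!batch_meta)
    return GST_PAD_PROBE_OK;

  /* One NvDsFrameMeta per frame in the batched buffer. */
  for (NvDsMetaList *l_frame = batch_meta->frame_meta_list; l_frame;
       l_frame = l_frame->next) {
    NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) l_frame->data;
    g_print ("source_id=%u frame_num=%d\n",
             frame_meta->source_id, frame_meta->frame_num);

    /* With output-tensor-meta=1, raw tensors are attached as user meta. */
    for (NvDsMetaList *l_user = frame_meta->frame_user_meta_list; l_user;
         l_user = l_user->next) {
      NvDsUserMeta *user_meta = (NvDsUserMeta *) l_user->data;
      if (user_meta->base_meta.meta_type == NVDSINFER_TENSOR_OUTPUT_META) {
        NvDsInferTensorMeta *tmeta =
            (NvDsInferTensorMeta *) user_meta->user_meta_data;
        g_print ("  output layers: %u\n", tmeta->num_output_layers);
        /* Model-specific parsing of tmeta->output_layers_info goes here. */
      }
    }
  }
  return GST_PAD_PROBE_OK;
}
```

The probe would be registered with `gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BUFFER, infer_probe_cb, NULL, NULL)` on a suitable element pad after the nvinfer element.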

There has been no update from you for a while, so we assume this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.