Computing Accuracy of a Model?

Hello guys,
I was looking at the DeepStream configuration files and wanted to know whether there is any plugin to measure accuracy (i.e., true positives and false negatives) for a classifier, a tracker, or a custom model that someone has built. Let's say I want to compute how many objects were correctly identified by my classifier. Is there any way to get this performance metric?

Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs. Include which sample app is being used, the configuration file contents, the command line used, and other details needed to reproduce.)
• Requirement details (This is for new requirements. Include the module name, i.e., which plugin or which sample application, and a description of the function.)

There is no such plugin. You can compare your ground-truth data with the DeepStream outputs yourself. For an overview of all plugins, please refer to the NVIDIA DeepStream SDK Developer Guide — DeepStream 6.1 Release documentation.
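For example, here is a minimal offline sketch of that comparison (not a DeepStream plugin): assuming you have dumped per-frame detections, e.g. via the KITTI output option of deepstream-app or a pad probe, and have ground-truth boxes in a similar format, you can match them by IoU and count true positives, false positives, and false negatives. The box format, label matching, and the 0.5 IoU threshold below are assumptions to adapt to your own setup.

```python
# Sketch only: compare DeepStream detections against ground-truth boxes
# with a greedy IoU match. Box format (label, x1, y1, x2, y2) and the
# 0.5 IoU threshold are assumptions, not part of any DeepStream API.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

def evaluate_frame(gt_boxes, det_boxes, iou_thresh=0.5):
    """Greedy one-to-one matching for a single frame.

    gt_boxes / det_boxes: lists of (label, x1, y1, x2, y2).
    Returns (true_positives, false_positives, false_negatives).
    """
    matched_gt = set()
    tp = fp = 0
    for d_label, *d_box in det_boxes:
        best_i, best_iou = None, 0.0
        for i, (g_label, *g_box) in enumerate(gt_boxes):
            if i in matched_gt or g_label != d_label:
                continue
            score = iou(d_box, g_box)
            if score > best_iou:
                best_i, best_iou = i, score
        if best_i is not None and best_iou >= iou_thresh:
            matched_gt.add(best_i)  # detection matches a ground-truth box
            tp += 1
        else:
            fp += 1                 # detection with no matching ground truth
    fn = len(gt_boxes) - len(matched_gt)  # ground truth never detected
    return tp, fp, fn

if __name__ == "__main__":
    # Hypothetical single frame: two ground-truth cars, one detected.
    gt = [("car", 10, 10, 50, 50), ("car", 60, 10, 100, 50)]
    det = [("car", 12, 11, 49, 52)]
    tp, fp, fn = evaluate_frame(gt, det)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    print(f"TP={tp} FP={fp} FN={fn} "
          f"precision={precision:.2f} recall={recall:.2f}")
```

Accumulating these counts over all frames gives overall precision and recall; for a tracker you would additionally need to carry object IDs through the matching, which this sketch does not cover.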

There has been no update from you for a while, so we assume this is no longer an issue.
Hence we are closing this topic. If you need further support, please open a new one.
Thanks