Save inference results along with cropped detections in a local folder

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): Jetson Nano
• DeepStream Version: 5.0
• TensorRT Version: 7.1
• Issue Type (questions, new requirements, bugs): question

Hi,

I am trying to find a way to do the following:

Get Tiny-YoloV3 inference results for a detection and save them as a txt file, along with the cropped detection as a JPEG, into a local folder.

What is the best way to do it?

zvika

What does “results” mean? BBOX?

For example: bbox coordinates, class_id.

Hi @zvikas,
You can refer to deepstream_reference_apps/back_to_back_detectors.c at master · NVIDIA-AI-IOT/deepstream_reference_apps · GitHub to add a probe on the OSD sink pad or the GIE source pad to extract the bbox and class_id information and then dump it to a txt file.
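For reference, here is a minimal sketch of such a probe, modeled on the pattern used in that sample. It assumes the standard NvDsBatchMeta / NvDsObjectMeta metadata from gstnvdsmeta.h; the output path /tmp/detections is just an example and the directory must already exist:

```c
#include <stdio.h>
#include <gst/gst.h>
#include "gstnvdsmeta.h"

/* Buffer probe: walks the DeepStream batch metadata and writes one txt
 * file per frame with the bbox coordinates and class_id of every object. */
static GstPadProbeReturn
osd_sink_pad_buffer_probe (GstPad *pad, GstPadProbeInfo *info, gpointer u_data)
{
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);
  if (!batch_meta)
    return GST_PAD_PROBE_OK;

  for (NvDsMetaList *l_frame = batch_meta->frame_meta_list; l_frame != NULL;
       l_frame = l_frame->next) {
    NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) l_frame->data;

    /* Example output path -- adjust to your setup; the directory must exist. */
    gchar path[256];
    g_snprintf (path, sizeof (path), "/tmp/detections/frame_%06d.txt",
        frame_meta->frame_num);
    FILE *fp = fopen (path, "w");
    if (!fp)
      continue;

    for (NvDsMetaList *l_obj = frame_meta->obj_meta_list; l_obj != NULL;
         l_obj = l_obj->next) {
      NvDsObjectMeta *obj_meta = (NvDsObjectMeta *) l_obj->data;
      fprintf (fp,
          "class_id=%d left=%.1f top=%.1f width=%.1f height=%.1f confidence=%.3f\n",
          obj_meta->class_id,
          obj_meta->rect_params.left, obj_meta->rect_params.top,
          obj_meta->rect_params.width, obj_meta->rect_params.height,
          obj_meta->confidence);
    }
    fclose (fp);
  }
  return GST_PAD_PROBE_OK;
}
```

The probe can then be attached to the OSD element's sink pad (or the GIE source pad, as mentioned above) with gst_pad_add_probe (osd_sink_pad, GST_PAD_PROBE_TYPE_BUFFER, osd_sink_pad_buffer_probe, NULL, NULL).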

Thanks!

Hi mchi,

I will definitely check it.

10x!