How to use a secondary detector with DeepStream 5.0 Python apps

Hello, I am using DeepStream 5.0 (Python apps) with my custom YOLOv3-tiny weights to detect objects in a video with the primary detector. My question is: how can I use a secondary detector, and what changes have to be made in the Python program to run a secondary detector that is a YOLO model?

Thanks in advance.

Moving to DeepStream SDK forum for resolution.

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the content of the configuration files, the command line used, and other details for reproducing.)
• Requirement details (This is for a new requirement. Include the module name, i.e. for which plugin or for which sample application, and the function description.)

Platform - Jetson Nano
DeepStream SDK - 5.0
TensorRT - 7+
I know a back-to-back detector is used for secondary detection in a C++ example. How do I use back-to-back detectors in Python? And why is the back-to-back detector example so slow on a Jetson Nano, or am I doing something wrong?

Refer to Is threre a python implementation of back-to-back detector? in deepstream 5.0 - #5 by bcao
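For context, in the Python apps a secondary detector is just a second nvinfer element inserted after the primary one; the back-to-back behavior comes from the secondary's config file, which tells it to operate on the primary's detected objects. A minimal sketch, assuming hypothetical config files pgie_config.txt and sgie_config.txt and the rest of the pipeline built as in the sample apps:

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.Pipeline.new("back-to-back")

# Primary detector (e.g. your YOLOv3-tiny model).
pgie = Gst.ElementFactory.make("nvinfer", "primary-inference")
pgie.set_property("config-file-path", "pgie_config.txt")  # hypothetical config

# Secondary detector: a second nvinfer that runs on the objects found by the
# primary. Its config file must set process-mode=2 (secondary mode) and
# operate-on-gie-id=1 so it processes the primary detector's detections.
sgie = Gst.ElementFactory.make("nvinfer", "secondary-inference")
sgie.set_property("config-file-path", "sgie_config.txt")  # hypothetical config

pipeline.add(pgie)
pipeline.add(sgie)

# ... nvstreammux -> pgie -> sgie -> nvvideoconvert -> nvdsosd -> sink
pgie.link(sgie)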

What do you mean by slow? 1 fps?

Yes, 2-3 fps at most. It stutters once it finds a car, and then stutters again to find a number plate. And why can I only pass a .h264 file and not an MP4 file?

Refer to GitHub - NVIDIA-AI-IOT/deepstream_tao_apps: Sample apps to demonstrate how to deploy models trained with TAO on DeepStream to measure the model inference perf.
Did you boost the NANO clocks with the two commands below before measuring the perf?
sudo nvpmodel -m 0
sudo jetson_clocks

To support MP4, you need to add qtdemux between filesrc and the h264parse (or h265parse) element, e.g.

gst-launch-1.0 filesrc location="a.mp4" ! qtdemux ! h264parse ! nvv4l2decoder …
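In the Python apps the equivalent change needs a bit more code, because qtdemux creates its source pads dynamically and must be linked from a pad-added callback. A minimal sketch, assuming the rest of the pipeline (decoder onward) is unchanged and reusing the a.mp4 placeholder from above:

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.Pipeline.new("mp4-pipeline")

source = Gst.ElementFactory.make("filesrc", "file-source")
source.set_property("location", "a.mp4")
demux = Gst.ElementFactory.make("qtdemux", "qt-demux")
parser = Gst.ElementFactory.make("h264parse", "h264-parser")
decoder = Gst.ElementFactory.make("nvv4l2decoder", "nvv4l2-decoder")

for elem in (source, demux, parser, decoder):
    pipeline.add(elem)

source.link(demux)
parser.link(decoder)

# qtdemux only creates its source pads after it has read the container,
# so the demux -> parser link must be made in a pad-added callback.
def on_pad_added(element, pad):
    if pad.get_name().startswith("video"):
        pad.link(parser.get_static_pad("sink"))

demux.connect("pad-added", on_pad_added)
# ... decoder then feeds nvstreammux exactly as in the original .h264 pipeline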

Thanks, I’ll do that.