Detected Faces Tracker

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): Jetson
• DeepStream Version: 6.2
• JetPack Version (valid for Jetson only):
• TensorRT Version: 8.5.2.2
I’m trying to implement a face detection pipeline in DS that detects faces and then tracks them. I’m a beginner in DS and I want to know if there’s a face tracking sample, or how to use the tracker plugin to track detected faces.

There is a face detection sample (using the NVIDIA FaceDetect model), but it doesn’t use a tracker. You can refer to /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test2 for tracker usage; the approach is the same as for other object detection models.
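
For orientation, here is a minimal sketch of where nvtracker sits in such a pipeline, written with the GStreamer Python bindings. The element order follows deepstream-test2 (detector first, then tracker); the pgie config path is a placeholder for your own FaceDetect config, not a file shipped with the SDK.

# Sketch only: pipeline skeleton with nvtracker placed right after the
# primary detector so it receives the detected face bounding boxes.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.Pipeline.new("face-track-pipeline")

streammux = Gst.ElementFactory.make("nvstreammux", "stream-muxer")
pgie = Gst.ElementFactory.make("nvinfer", "face-detector")      # FaceDetect model
tracker = Gst.ElementFactory.make("nvtracker", "tracker")       # tracks detected faces
nvvidconv = Gst.ElementFactory.make("nvvideoconvert", "convertor")
nvosd = Gst.ElementFactory.make("nvdsosd", "onscreendisplay")   # draws bboxes and track IDs

# Placeholder path -- replace with your FaceDetect nvinfer config.
pgie.set_property("config-file-path", "facedetect_pgie_config.txt")

for elem in (streammux, pgie, tracker, nvvidconv, nvosd):
    pipeline.add(elem)

# Source bin(s) link to streammux sink pads; downstream the order is:
streammux.link(pgie)
pgie.link(tracker)
tracker.link(nvvidconv)
nvvidconv.link(nvosd)
# ...then link nvosd to a video sink (e.g. nveglglessink) and set the
# pipeline to PLAYING, as in the test apps.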

Thank you.
Is there a way to use the available .txt config file with the detection model, or is it only designed for object tracking?
And can I find any documentation on how to link the tracker to the FaceDetect model?

You can use the config file dstest2_tracker_config.txt in deepstream-test2.
You may refer to the online documentation for the tracker: Gst-nvtracker — DeepStream 6.2 Release documentation (nvidia.com)
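
In the Python port of deepstream-test2, the values from dstest2_tracker_config.txt are applied to the nvtracker element with configparser, roughly along these lines (a sketch; the keys listed are standard nvtracker properties, and your config file may contain only a subset of them):

# Sketch only: apply the [tracker] section of a config file to an existing
# nvtracker element. Assumes the keys match nvtracker property names.
import configparser

def apply_tracker_config(tracker, path="dstest2_tracker_config.txt"):
    config = configparser.ConfigParser()
    config.read(path)
    for key in config["tracker"]:
        if key in ("tracker-width", "tracker-height", "gpu-id"):
            tracker.set_property(key, config.getint("tracker", key))
        elif key in ("ll-lib-file", "ll-config-file"):
            tracker.set_property(key, config.get("tracker", key))

# Called right after creating the nvtracker element, e.g. apply_tracker_config(tracker).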

Is there a way to confirm that the tracker is working properly in test-2? I have changed the config file to use the face detection model, but I couldn’t find a way to check whether it is actually using the tracker or just processing frame by frame.
Is there a way to find the FPS for test-2?

Yes. You can print the tracker information; please refer to the NvDsObjectMeta structure in sources\includes\nvdsmeta.h.

Please also refer to this FAQ entry: https://forums.developer.nvidia.com/t/deepstream-sdk-faq/80236/12
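
One way to check this from Python (a sketch using the pyds bindings, not code taken from the sample): attach a buffer probe downstream of nvtracker and print the tracker-populated fields of each object’s metadata. If the tracker is running, object_id stays stable for the same face across frames instead of changing every frame.

# Sketch only: pad probe that prints tracker-populated metadata. Assumes the
# DeepStream Python bindings (pyds) and that the probe is attached downstream
# of nvtracker (e.g. on the nvdsosd sink pad).
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst
import pyds

def tracker_check_probe(pad, info, user_data):
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK
    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        l_obj = frame_meta.obj_meta_list
        while l_obj is not None:
            obj_meta = pyds.NvDsObjectMeta.cast(l_obj.data)
            # object_id is assigned by the tracker; a stable ID for the same
            # face across frames shows the tracker is active.
            print(f"frame {frame_meta.frame_num}: track id {obj_meta.object_id}, "
                  f"tracker confidence {obj_meta.tracker_confidence:.2f}")
            try:
                l_obj = l_obj.next
            except StopIteration:
                break
        try:
            l_frame = l_frame.next
        except StopIteration:
            break
    return Gst.PadProbeReturn.OK

# Usage: nvosd.get_static_pad("sink").add_probe(Gst.PadProbeType.BUFFER, tracker_check_probe, 0)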

I couldn’t understand how to use nvdsmeta.h to check if the tracker is working.
Also, is there a way to add FPS reporting to the DeepStream Python examples?

There are some tracker-related variables in NvDsObjectMeta, such as NvDsComp_BboxInfo tracker_bbox_info and gfloat tracker_confidence.
For FPS in Python, please refer to the demo code: https://github.com/NVIDIA-AI-IOT/deepstream_python_apps/blob/master/apps/deepstream-test3/deepstream_test_3.py#L35C1-L35C33.
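
If you prefer not to pull in the sample’s FPS helper, a simple alternative (a sketch, not the demo’s implementation) is to count buffers in a pad probe and print the rate once per second; with a single source each buffer on the pad corresponds to one frame.

# Sketch only: rough FPS measurement from a buffer pad probe. With one input
# source, each buffer on this pad carries one frame.
import time
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

class FpsProbe:
    def __init__(self):
        self.frame_count = 0
        self.start = time.time()

    def __call__(self, pad, info, user_data):
        self.frame_count += 1
        elapsed = time.time() - self.start
        if elapsed >= 1.0:
            print(f"FPS: {self.frame_count / elapsed:.1f}")
            self.frame_count = 0
            self.start = time.time()
        return Gst.PadProbeReturn.OK

# Usage: attach to any pad downstream of the decoder, e.g.
# nvosd.get_static_pad("sink").add_probe(Gst.PadProbeType.BUFFER, FpsProbe(), 0)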

I found that there are three different tracker libraries in DS (IOU, NvSORT, NvDCF). Is there a way to observe the difference between them on a sample test video?

Yes. You can change the config file. Please refer to our demo code: sources\apps\sample_apps\deepstream-test2\dstest2_tracker_config.txt.

# ll-config-file required to set different tracker types
# ll-config-file=../../../../samples/configs/deepstream-app/config_tracker_IOU.yml
# ll-config-file=../../../../samples/configs/deepstream-app/config_tracker_NvSORT.yml
ll-config-file=../../../../samples/configs/deepstream-app/config_tracker_NvDCF_perf.yml
# ll-config-file=../../../../samples/configs/deepstream-app/config_tracker_NvDCF_accuracy.yml
# ll-config-file=../../../../samples/configs/deepstream-app/config_tracker_NvDeepSORT.yml

Yes, this part is clear. But is there a way to visualize the effect of using these different trackers in the output of a sample video, or is the difference only in hardware utilization?

No. Please refer to the Gst-nvtracker documentation linked above for their algorithms and differences.
