Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU) Jetson
• DeepStream Version 6.2
• JetPack Version (valid for Jetson only)
• TensorRT Version 184.108.40.206
I’m trying to implement a face detection pipeline in DeepStream that detects faces and then tracks them. I’m a beginner in DeepStream, and I want to know whether there is a face-tracking sample, or how to use the tracker plugin to track detected faces.
There is a sample for face detection (using the NVIDIA FaceDetect model), but it doesn’t use the tracker. You can refer to /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test2 for tracker usage; it is common to all object detection models.
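For orientation, this is a simplified sketch of the element order used in deepstream-test2 (the tracker sits between the primary inference engine and the display elements); swapping the PGIE config for the FaceDetect model is the only conceptual change needed to track faces:

```
filesrc → h264parse → nvv4l2decoder → nvstreammux →
  nvinfer (PGIE, e.g. FaceDetect) → nvtracker → nvvideoconvert → nvdsosd → sink
```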
Is there a way to use the available .txt tracker config file with the FaceDetect model, or is it only designed for generic object tracking?
Also, can I find any documentation on how to link the tracker to the FaceDetect model?
You can use the config file dstest2_tracker_config.txt from deepstream-test2.
You may refer to the online doc for tracker: Gst-nvtracker — DeepStream 6.2 Release documentation (nvidia.com)
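For reference, the tracker group in that config file looks roughly like the following. This is a sketch based on a default DS 6.x install; the library path and values may differ in your version, so check your local dstest2_tracker_config.txt:

```ini
[tracker]
tracker-width=640
tracker-height=384
gpu-id=0
# Since DS 6.x the trackers ship in one unified library:
ll-lib-file=/opt/nvidia/deepstream/deepstream/lib/libnvds_nvmultiobjecttracker.so
# The low-level config file selects the tracker algorithm (NvDCF here):
ll-config-file=config_tracker_NvDCF_perf.yml
enable-batch-process=1
```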
Is there a way to confirm that the tracker is working well in test2? I have changed the config file to use the face detection model, but I couldn’t find a way to check whether it is using the tracker or just processing frame by frame.
Also, is there a way to find the FPS for test2?
Yes. You can print the tracker information; please refer to the structure below.
Please refer to the link below: https://forums.developer.nvidia.com/t/deepstream-sdk-faq/80236/12
I couldn’t understand how to use nvdsmeta.h to check whether the tracker is working.
Also, is there a way to add FPS reporting to the DeepStream Python examples?
There are some tracker-related fields in NvDsObjectMeta, e.g.:
NvDsComp_BboxInfo tracker_bbox_info; gfloat tracker_confidence;
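A practical way to confirm the tracker is running is to check the object IDs it assigns: nvtracker sets `object_id` on each `NvDsObjectMeta`, and untracked objects carry the sentinel value `UNTRACKED_OBJECT_ID` (all bits set). The sketch below illustrates that check with plain dicts standing in for the metadata, so it runs anywhere; in a real pipeline you would read the same fields from `pyds.NvDsObjectMeta` inside a pad probe attached to the nvtracker src pad:

```python
# Sentinel value nvtracker uses for objects that were NOT tracked
# (UNTRACKED_OBJECT_ID in the DeepStream headers: a guint64 of all 1s).
UNTRACKED_ID = 0xFFFFFFFFFFFFFFFF

def tracker_active(objects):
    """Return True if any detected object carries a real tracker ID.

    `objects` stands in for the per-frame object metadata list; each
    entry mimics the NvDsObjectMeta fields we care about.
    """
    return any(o["object_id"] != UNTRACKED_ID for o in objects)

# Two frames' worth of stand-in metadata: the same face keeps ID 1,
# which is exactly what a working tracker produces across frames.
frame1 = [{"object_id": 1, "tracker_confidence": 0.92}]
frame2 = [{"object_id": 1, "tracker_confidence": 0.88}]

print(tracker_active(frame1))  # True -> the tracker is assigning IDs
```

If the IDs printed from your probe stay stable for the same face across frames, the tracker is working; if every object reports the sentinel value, the pipeline is effectively running detection frame by frame.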
For FPS in python, please refer to the demo code: https://github.com/NVIDIA-AI-IOT/deepstream_python_apps/blob/master/apps/deepstream-test3/deepstream_test_3.py#L35C1-L35C33.
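The linked test3 sample uses a small FPS helper; a minimal standalone version (my own sketch, not the repo’s exact class) just counts buffers in the probe and divides by elapsed time:

```python
import time

class FPSCounter:
    """Count frames and report the average frames-per-second so far."""

    def __init__(self):
        self.start = time.time()
        self.frames = 0

    def tick(self):
        # Call once per buffer, e.g. inside the OSD sink pad probe.
        self.frames += 1

    def fps(self):
        elapsed = time.time() - self.start
        return self.frames / elapsed if elapsed > 0 else 0.0

counter = FPSCounter()
for _ in range(30):          # stand-in for 30 buffers arriving
    counter.tick()
print(f"{counter.fps():.1f} FPS measured so far")
```

In a pipeline you would create one counter per stream, call `tick()` in the probe, and print `fps()` periodically (e.g. from a GLib timeout).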
I found that there are 3 different tracker libraries in DS (IOU, NvSORT, NvDCF). Is there a way to observe the difference between them on a sample test video?
Yes. You can change the config file. Please refer to our demo code:
# ll-config-file is required to set different tracker types
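Concretely, switching trackers means pointing ll-config-file at a different bundled low-level config. The file names below are from a DS 6.2 install under samples/configs/deepstream-app/ and may vary by version:

```ini
# Choose ONE low-level config; each selects a different tracker algorithm:
#ll-config-file=config_tracker_IOU.yml
#ll-config-file=config_tracker_NvSORT.yml
ll-config-file=config_tracker_NvDCF_perf.yml
```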
Yes, this part is clear. But is there a way to visualize the effect of using these different trackers in the output of a sample video, or is the difference only in hardware utilization?
No. Please refer to the Gst-nvtracker guide linked above for their algorithms and differences.