How to access the nvdsanalytics direction detection features in DeepStream Python

• Hardware Platform : Jetson AGX Xavier
• DeepStream Version : 5.0.1 / gstreamer 1.14.5
• JetPack Version : 4.3
• TensorRT Version : 7.1.3
• NVIDIA GPU Driver Version : 10.2
• Issue Type : questions

Hello.

I would like to know how I can use the “direction detection” feature offered by nvdsanalytics. When a direction is configured, the direction of each detected and tracked object is displayed on screen correctly.
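
For context, this is roughly how I add the element in my Python pipeline. It is only a sketch: the element and property names follow the deepstream-nvdsanalytics sample, but the config path and the config keys shown in the comments are placeholders for my own file.

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# nvdsanalytics sits between the tracker and the OSD, as in the sample pipeline:
# streammux -> pgie -> nvtracker -> nvdsanalytics -> nvvideoconvert -> nvdsosd -> sink
nvanalytics = Gst.ElementFactory.make("nvdsanalytics", "analytics")
if not nvanalytics:
    raise RuntimeError("Unable to create nvdsanalytics element")

# The config file carries a per-stream group such as [direction-detection-stream-0]
# with keys along the lines of (values here are placeholders):
#   enable=1
#   direction-South=x1;y1;x2;y2   # label plus a reference vector
#   class-id=-1
nvanalytics.set_property("config-file", "config_nvdsanalytics.txt")
```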

I also noticed that if an object is not moving, the label “stopped” is displayed.

Could you explain the exact condition under which DeepStream decides that an object is “stopped”? And is it possible for us to customize this condition, for example so that an object is only considered stopped when it stays in place for more than 5 seconds? Thank you so much in advance.
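
In case it helps to clarify what I mean, below is the kind of custom check I could implement myself in a pad probe if the plugin’s condition cannot be changed. This is purely my own bookkeeping on top of the tracker IDs, not anything nvdsanalytics exposes; `STOP_SECONDS`, `STOP_RADIUS_PX` and the probe are hypothetical names.

```python
import math
import time
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst
import pyds

STOP_SECONDS = 5.0     # how long an object must hold position to count as "stopped"
STOP_RADIUS_PX = 10.0  # how far the centroid may drift and still count as holding position

# object_id -> (time the object reached its current position, anchor_x, anchor_y)
_anchor = {}

def stopped_check_probe(pad, info, u_data):
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK
    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    now = time.monotonic()
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        l_obj = frame_meta.obj_meta_list
        while l_obj is not None:
            obj_meta = pyds.NvDsObjectMeta.cast(l_obj.data)
            rect = obj_meta.rect_params
            cx = rect.left + rect.width / 2.0
            cy = rect.top + rect.height / 2.0
            oid = obj_meta.object_id  # tracker-assigned id

            prev = _anchor.get(oid)
            if prev is None or math.hypot(cx - prev[1], cy - prev[2]) > STOP_RADIUS_PX:
                # New object, or it moved: restart the timer at the current position.
                _anchor[oid] = (now, cx, cy)
            elif now - prev[0] >= STOP_SECONDS:
                print(f"object {oid} has been stopped for >= {STOP_SECONDS} s")

            try:
                l_obj = l_obj.next
            except StopIteration:
                break
        try:
            l_frame = l_frame.next
        except StopIteration:
            break
    return Gst.PadProbeReturn.OK
```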

+) Line crossing and ROI filtering are based on the bottom center of each object’s bounding box. Are we able to change that reference point as well, for example from the bottom center to (xmin, ymin)?
(I’m using DeepStream Python.)
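
If the built-in test cannot be changed, the fallback I have in mind is running my own point-in-ROI check on (xmin, ymin) inside a pad probe. Again just a sketch; `MY_ROI` is a hypothetical list of polygon vertices, not a plugin option.

```python
MY_ROI = [(100, 100), (800, 100), (800, 600), (100, 600)]  # hypothetical polygon vertices

def point_in_roi(x, y, roi):
    """Standard ray-casting point-in-polygon test; roi is a list of (x, y) vertices."""
    inside = False
    n = len(roi)
    for i in range(n):
        x1, y1 = roi[i]
        x2, y2 = roi[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Inside the per-object loop of a pad probe:
#   rect = obj_meta.rect_params
#   if point_in_roi(rect.left, rect.top, MY_ROI):  # (xmin, ymin) = top-left corner
#       ...
```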

Hey, could you refer to deepstream_python_apps/apps/deepstream-nvdsanalytics at master · NVIDIA-AI-IOT/deepstream_python_apps · GitHub?
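
The relevant part of that sample is the buffer probe that reads the per-object analytics metadata, including the direction status. Roughly, it looks like this (field names follow the pyds bindings used in the sample; the printing is only illustrative):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst
import pyds

def nvanalytics_src_pad_buffer_probe(pad, info, u_data):
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK
    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        l_obj = frame_meta.obj_meta_list
        while l_obj is not None:
            obj_meta = pyds.NvDsObjectMeta.cast(l_obj.data)
            l_user = obj_meta.obj_user_meta_list
            while l_user is not None:
                user_meta = pyds.NvDsUserMeta.cast(l_user.data)
                if user_meta.base_meta.meta_type == pyds.nvds_get_user_meta_type(
                        "NVIDIA.DSANALYTICSOBJ.USER_META"):
                    obj_info = pyds.NvDsAnalyticsObjInfo.cast(user_meta.user_meta_data)
                    if obj_info.dirStatus:  # direction label reported for this object
                        print("object", obj_meta.object_id, "direction:", obj_info.dirStatus)
                    if obj_info.lcStatus:   # line-crossing labels, if any
                        print("object", obj_meta.object_id, "crossed:", obj_info.lcStatus)
                try:
                    l_user = l_user.next
                except StopIteration:
                    break
            try:
                l_obj = l_obj.next
            except StopIteration:
                break
        try:
            l_frame = l_frame.next
        except StopIteration:
            break
    return Gst.PadProbeReturn.OK
```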
