How to drop old frames instead of buffering too much on a low-spec Jetson Nano 2GB device

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
Jetson Nano 2GB.
• DeepStream Version
DS 6
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs)
questions
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

I’ve retrained a detectnet_v2 (ResNet-18) model with TAO, and it runs well on my Jetson Nano 2GB with a file source. The app is a customized Python app (modified from the official Python sample app deepstream-test4) and achieves about 6 FPS inference.

But I see problems when switching the source to a live camera RTSP stream (25 fps, 1280×960): a huge time lag (minutes) builds up between the inferred OSD output and the camera’s live monitor.

I understand the root cause is the poor performance of the hardware, which can’t achieve real-time inference and likely never will.

So I wonder: is there a way, in Python code, to drop old frames rather than buffering so many un-inferenced frames?
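
To make the question concrete, the behaviour I’m after is something like a standard GStreamer leaky queue placed in front of the inference element. Below is a minimal sketch of what I mean; whether this is the recommended approach in DeepStream is exactly what I’m unsure about, and the element placement is an assumption on my part:

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Hypothetical "drop-old-frames" queue, to be added to the pipeline and
# linked between the decoder (or nvstreammux) and the nvinfer element.
# With leaky=downstream the queue discards its oldest buffers once full,
# instead of back-pressuring the source, so stale frames are dropped
# rather than piling up for minutes.
drop_queue = Gst.ElementFactory.make("queue", "drop-old-frames")
drop_queue.set_property("leaky", 2)             # 2 = downstream: drop oldest buffers
drop_queue.set_property("max-size-buffers", 4)  # keep at most ~4 frames queued
drop_queue.set_property("max-size-time", 0)     # disable the time limit
drop_queue.set_property("max-size-bytes", 0)    # disable the byte limit
```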

Please refer to the Troubleshooting — DeepStream 6.0 Release documentation.
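
As a starting point while reading that guide, here is a hedged sketch of the properties typically tuned when a live source outruns inference. The element names ("source", "Stream-muxer", "primary-inference", "nvvideo-renderer") follow the DeepStream Python sample apps and may not match a customized pipeline, and the values should be adjusted for the Nano 2GB:

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst


def tune_for_live_source(pipeline: Gst.Pipeline) -> None:
    """Sketch: favour frame dropping over buffering for a live RTSP input.

    The element names below follow the DeepStream Python sample apps and
    are assumptions; look them up by whatever names the customized app uses.
    """
    source = pipeline.get_by_name("source")               # rtspsrc / uridecodebin
    streammux = pipeline.get_by_name("Stream-muxer")       # nvstreammux
    pgie = pipeline.get_by_name("primary-inference")       # nvinfer
    sink = pipeline.get_by_name("nvvideo-renderer")        # nveglglessink etc.

    if streammux:
        # Treat the input as live: push a batch once the timeout expires
        # instead of waiting for frames that cannot keep up.
        streammux.set_property("live-source", 1)
        streammux.set_property("batched-push-timeout", 40000)  # microseconds (~25 fps)

    if pgie:
        # Run the detector only on every Nth frame; skipped frames reuse
        # the previous detections, cutting the compute load substantially.
        pgie.set_property("interval", 3)

    if source is not None and source.find_property("drop-on-latency"):
        # rtspsrc: cap the jitter buffer and drop data that exceeds it.
        source.set_property("latency", 200)
        source.set_property("drop-on-latency", True)

    if sink:
        # Do not synchronize rendering to buffer timestamps, otherwise the
        # sink blocks on frames that are already late and the lag grows.
        sink.set_property("sync", 0)
```

Note that `interval` trades detection freshness for throughput, so whether roughly 6 FPS inference plus frame skipping is acceptable depends on the application.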

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.