Deepstream cumulative delay

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): Jetson Nano
• DeepStream Version: 5
• JetPack Version (valid for Jetson only) 4.4

Hello everyone, I am writing about the following. I am developing an application with DeepStream based on test applications 3 and 4 plus analytics. This application sends an MQTT message every 250 processed frames. The two video sources are IP cameras (RTSP) transmitting at 25 FPS, so in theory an MQTT message should be sent approximately every 10 seconds.

However, after starting the application, a time lag is visible between what the camera shows and the processed video on the output sink. I could see that this lag was cumulative: roughly 10 seconds of additional lag for every minute elapsed.

This led me to test “deepstream-app” with different settings and the same video sources to see if I could reproduce the problem with “deepstream-app”. Below you can see some of the relevant results obtained:

1- nvinfer
prueba_1

2- nvinfer + tracker(NvDCF) with pgie interval = 1
prueba_4

3- nvinfer + tracker(NvDCF) with pgie interval = 4
prueba_3

4- nvinfer + tracker(NvDCF) + analytics with pgie interval = 4
prueba_12

The tests show that merely adding the tracker causes a decrease in the processed FPS. The decrease is larger with interval = 1; however, with interval = 4 the drop in FPS is smaller.

The configuration used in the application I am developing is similar to the last one described (nvinfer + tracker + analytics with pgie interval = 4), so the processing speed should be approximately 21-22 FPS. That leaves about 4 unprocessed frames per second, and consequently about 240 backlogged frames per minute. At 21 FPS it would take approximately 11.4 seconds to process those 240 frames, which is consistent with the roughly 10 seconds of lag per minute in my application.
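The arithmetic above can be checked with a quick back-of-the-envelope script (the 25 FPS camera rate and 21 FPS pipeline rate are the numbers from my tests):

```python
# Cumulative-lag estimate: live source faster than the pipeline.
camera_fps = 25    # camera transmission rate
pipeline_fps = 21  # measured processing rate with nvinfer + tracker + analytics

# Frames arriving each minute vs. frames the pipeline can process:
arriving_per_min = camera_fps * 60                       # 1500
processed_per_min = pipeline_fps * 60                    # 1260
backlog_per_min = arriving_per_min - processed_per_min   # 240 frames/min

# Time needed to work through that backlog at the pipeline rate:
lag_seconds_per_min = backlog_per_min / pipeline_fps
print(f"backlog: {backlog_per_min} frames/min, "
      f"extra lag: {lag_seconds_per_min:.1f} s per minute")
```

This prints a backlog of 240 frames per minute and about 11.4 s of extra lag per minute, matching the observed ~10 s/min.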

To confirm that this was the problem, I lowered the cameras' FPS to 21, and the application stopped accumulating lag. What catches my attention is that with “deepstream-app” this lag did not occur even though the cameras were transmitting at 25 FPS. Which property should be configured so that not every frame is processed, avoiding this lag without having to lower the FPS of the cameras?

You may try enlarging the nvinfer “interval” property to see if there is an improvement, and make sure “live-source” is set to 1 in the streammux for RTSP source input.
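For reference, in a deepstream-app configuration these two settings live in the [streammux] and [primary-gie] groups; a minimal sketch (resolution and batch-size values are illustrative, not taken from the thread):

```ini
[streammux]
# Required for RTSP/live inputs so the muxer timestamps frames as they arrive
live-source=1
batch-size=2
width=1920
height=1080

[primary-gie]
enable=1
# Run inference only on every (interval+1)-th frame;
# the tracker covers the frames in between
interval=4
```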

Could increasing the “interval” parameter too much affect the performance of the tracker, for example?

Besides, I realized that I had not used the “drop-frame-interval” parameter on the video sources (sources group). So I am now looking for a trade-off between “interval” and “drop-frame-interval” that does not require changing the FPS on the video cameras directly.
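For illustration, “drop-frame-interval” is set per source in the deepstream-app config; the URI and interval value below are placeholders, not my actual settings:

```ini
[source0]
enable=1
# type=4 selects an RTSP source in deepstream-app
type=4
uri=rtsp://camera-1/stream
num-sources=1
# Have the decoder output only every Nth frame; with a 25 FPS camera,
# a value of 5 would leave roughly 5 FPS reaching the pipeline
drop-frame-interval=5
```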

Based on your experience, is this an appropriate approach?

By increasing the interval, the number of bboxes will decrease, and tracker performance will increase.

Interesting, I had the idea that delivering bboxes to the tracker more frequently would improve the tracker's performance.
I will run tests with your recommendations and report back when I have results.

Thank you