RTSP stream delay in deepstream-app using PeopleNet

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): Tesla P4
• DeepStream Version: DS 5.1
• TensorRT Version: 7.2.1
• NVIDIA GPU Driver Version (valid for GPU only): 460.80
• Issue Type (questions, new requirements, bugs): visible delay in the RTSP streams while running the PeopleNet model

I am running the default deepstream-app application with the PeopleNet config on RTSP streams.

The application runs without any delay for some time, but whenever there are multiple persons in the camera's view, the stream lags and starts showing a delay, then reverts back to real time after a while.

I am using the pretrained PeopleNet model provided on NGC.

Some screenshots of the model detection:

[screenshots attached to the original post]

Can anyone help fix this delay so that the stream always stays real-time, even with multiple persons detected?

Which application are you using? Can you send us the video for testing?

@Fiona.Chen Thanks for replying.

The application I’m using is the default deepstream-app sample code present in /sources/apps/sample_apps.

We are not testing this on MP4 or H.264 files. I am running a live RTSP feed from an IP camera, which shows the above-mentioned issue.

Attached below are the code and config files I am working with, where I am facing the issue.

deepstream-app.tar.gz (133.5 KB)
deepstream_app_source1_peoplenet.txt (4.1 KB)
config_infer_primary_peoplenet.txt (2.0 KB)

The model used is the pretrained PeopleNet model.

You can try setting the “latency” property of the source (rtspsrc) inside uridecodebin:

gst-launch-1.0 rtspsrc latency=200 location=...
or
gst-launch-1.0 uridecodebin source::latency=200 ...
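
For deepstream-app itself, the same property can be set from code. Below is a minimal sketch, assuming the source bin is built around uridecodebin (as the URI-type sources in deepstream_source_bin.c are); the callback name and the 200 ms value are only illustrative, not part of the stock application:

#include <gst/gst.h>

/* "source-setup" fires when uridecodebin creates its internal source
 * element; for an rtsp:// URI that element is rtspsrc. */
static void
on_source_setup (GstElement *uridecodebin, GstElement *source, gpointer user_data)
{
  if (g_object_class_find_property (G_OBJECT_GET_CLASS (source), "latency")) {
    /* Jitter-buffer size in milliseconds; 200 is just a starting point. */
    g_object_set (source, "latency", 200, NULL);
  }
}

/* Connect it right after the uridecodebin element is created, e.g. in the
 * source-bin creation code:
 *   g_signal_connect (uridecodebin, "source-setup",
 *                     G_CALLBACK (on_source_setup), NULL);
 */

If your DeepStream 5.1 source group already exposes it, the simpler route is the latency key in the [source0] group of deepstream_app_source1_peoplenet.txt, which sets the RTSP jitter-buffer size in milliseconds without any code change. Keep in mind that a larger jitter buffer smooths out network jitter at the cost of a fixed extra delay.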

Hi @Fiona.Chen,

Is there any further suggestion for this issue, where the real-time stream starts showing a delay after the application has been running for some time and multiple persons are detected?