DeepStream slow inferencing

• Hardware Platform (Jetson / GPU): Jetson AGX Xavier [16GB]
• DeepStream Version: DS 4.0
• JetPack Version (valid for Jetson only): L4T 32.3.1 [JetPack 4.3]
• TensorRT Version:
• NVIDIA GPU Driver Version (valid for GPU only):

Full environment:
NVIDIA Jetson AGX Xavier [16GB]
L4T 32.3.1 [JetPack 4.3]
Ubuntu 18.04.4 LTS
Kernel Version: 4.9.140-tegra
CUDA 10.0.326
CUDA Architecture: 7.2
OpenCV version: 4.1.1
OpenCV CUDA: NO
VisionWorks:
VPI: 0.1.0

I’ve been trying to familiarize myself with the DeepStream SDK. I modified the sample “deepstream-test2” application to pick up an RTSP stream using the uridecodebin element. The original application had a good frame rate with the local video file, but the modified application now only refreshes the frame about every 5 seconds. Playing the stream on its own is reliable and smooth, so I suspect the problem lies somewhere in inferencing or tracking. What possible issues should I look for? Where can I find resources on the lower-level libraries? Please advise.
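Since the question asks what to look for: two settings commonly checked when an RTSP source stalls while a local file plays fine are the stream mux's live-source handling and the primary nvinfer's frame-skipping interval. The sketch below uses deepstream-app-style config sections with illustrative values (the key names come from the DeepStream plugin documentation; verify them against the DS 4.0 manual for your release):

```
# [streammux] section of a deepstream-app-style config (illustrative values)
[streammux]
live-source=1              # tell nvstreammux the RTSP source is live
batched-push-timeout=40000 # push a batch after 40 ms even if it is not full
width=1280
height=720

# [primary-gie] section (illustrative value)
[primary-gie]
interval=4                 # infer on every 5th frame; the tracker covers the gaps
```

In a modified deepstream-test2 (which is plain C rather than config-driven), the equivalent would be setting the same properties on the elements with `g_object_set`, e.g. `live-source` and `batched-push-timeout` on the nvstreammux element and `interval` on the pgie element.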

Pipeline for application:
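The pipeline diagram itself did not come through in the post. As a hypothetical sketch only (element names follow the stock deepstream-test2 sample, not necessarily the poster's actual graph), a test2-style pipeline adapted for RTSP might look like:

```
uridecodebin (rtsp://...) -> nvstreammux -> nvinfer (pgie)
  -> nvtracker -> nvinfer (sgie1..sgie3)
  -> nvvideoconvert -> nvdsosd -> nveglglessink
```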

It’s better to upgrade to DeepStream 5.0 GA. You can download it here: and the documentation is here:

Since there is not much useful information in your description, please refer to to see whether there is anything helpful.