DeepStream slow inferencing

• Hardware Platform (Jetson / GPU) - Jetson AGX Xavier [16GB]
• DeepStream Version - DS 4.0
• JetPack Version (valid for Jetson only) - L4T 32.3.1 [JetPack 4.3]
• TensorRT Version - 6.0.1.10
• NVIDIA GPU Driver Version (valid for GPU only) - N/A (Jetson platform)

Full system information:
Ubuntu 18.04.4 LTS
Kernel Version: 4.9.140-tegra
CUDA 10.0.326
CUDA Architecture: 7.2
OpenCV version: 4.1.1
OpenCV CUDA: NO
cuDNN: 7.6.3.28
VisionWorks: 1.6.0.500n
VPI: 0.1.0

I’ve been trying to familiarize myself with the DeepStream SDK. I modified the sample “deepstream-test2” application to pick up an RTSP stream using the uridecodebin element (a rough sketch of the change is included after the pipeline below). The original application had a good frame rate with the local video file, but the modified application now only refreshes the frame about every 5 seconds. Playing the stream on its own is reliable and smooth, which makes me suspect inferencing or tracking is the issue. What are possible issues I should look for? Where can I find resources on the lower-level libraries? Please advise.

Pipeline for application:
uridecodebin -> streammux -> pgie -> nvtracker -> sgie1 -> sgie2 -> sgie3 -> nvvidconv -> nvosd -> sink
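
For context, a minimal sketch of this kind of uridecodebin-to-nvstreammux hookup is shown below, loosely following the pad-added pattern used in the DeepStream reference samples. The RTSP URI, resolution, and muxer settings are placeholder assumptions, and the pgie/nvtracker/sgie/nvosd chain from the pipeline above is replaced with a fakesink so the sketch stays short. The live-source and batched-push-timeout properties on nvstreammux are common things to check for laggy live input, not a confirmed fix.

#include <gst/gst.h>

/* Called when uridecodebin exposes a decoded pad; link video pads to the muxer. */
static void
cb_newpad (GstElement *decodebin, GstPad *decoder_src_pad, gpointer data)
{
  GstElement *streammux = (GstElement *) data;
  GstCaps *caps = gst_pad_query_caps (decoder_src_pad, NULL);
  const GstStructure *str = gst_caps_get_structure (caps, 0);

  if (g_str_has_prefix (gst_structure_get_name (str), "video")) {
    GstPad *sinkpad = gst_element_get_request_pad (streammux, "sink_0");
    if (gst_pad_link (decoder_src_pad, sinkpad) != GST_PAD_LINK_OK)
      g_printerr ("Failed to link decoder src pad to stream muxer\n");
    gst_object_unref (sinkpad);
  }
  gst_caps_unref (caps);
}

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GMainLoop  *loop      = g_main_loop_new (NULL, FALSE);
  GstElement *pipeline  = gst_pipeline_new ("rtsp-test-pipeline");
  GstElement *source    = gst_element_factory_make ("uridecodebin", "uri-decode-bin");
  GstElement *streammux = gst_element_factory_make ("nvstreammux", "stream-muxer");
  GstElement *sink      = gst_element_factory_make ("fakesink", "fake-sink");

  if (!pipeline || !source || !streammux || !sink) {
    g_printerr ("One element could not be created. Exiting.\n");
    return -1;
  }

  /* "rtsp://<camera-uri>" is a placeholder for the real stream address. */
  g_object_set (G_OBJECT (source), "uri", "rtsp://<camera-uri>", NULL);

  /* For a live RTSP feed, live-source=1 and a short batched-push-timeout
   * keep nvstreammux from waiting to fill a batch before pushing frames. */
  g_object_set (G_OBJECT (streammux),
      "batch-size", 1,
      "width", 1280, "height", 720,
      "live-source", 1,
      "batched-push-timeout", 40000,   /* microseconds */
      NULL);

  gst_bin_add_many (GST_BIN (pipeline), source, streammux, sink, NULL);

  /* uridecodebin creates its src pad only once the stream is decoded,
   * so linking to nvstreammux happens in the pad-added callback. */
  g_signal_connect (G_OBJECT (source), "pad-added", G_CALLBACK (cb_newpad), streammux);

  if (!gst_element_link (streammux, sink)) {
    g_printerr ("Elements could not be linked. Exiting.\n");
    return -1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_main_loop_run (loop);
  return 0;
}

Build it with the same gstreamer-1.0 pkg-config flags as the other test apps, and swap the fakesink back for the pgie -> nvtracker -> sgie -> nvvidconv -> nvosd -> sink chain once the raw RTSP path is confirmed smooth.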

It’s better to upgrade to DeepStream 5.0 GA. You can download it from https://developer.nvidia.com/deepstream-sdk, and the documentation is at https://docs.nvidia.com/metropolis/deepstream/dev-guide/index.html#page/DeepStream_Development_Guide/deepstream_support.html

Since there is not enough information in your description to pinpoint the cause, please refer to the plugin troubleshooting guide at https://docs.nvidia.com/metropolis/deepstream/dev-guide/index.html#page/DeepStream_Troubleshooting_2019/deepstream_plugin_troubleshooting.html# to see whether anything there helps.