When I use a single source input with the NVIDIA Video Renderer in Graph Composer to display RTSP video, the video lags.

• Hardware Platform (Jetson / GPU) : NVIDIA GeForce GTX 1080 Ti, CUDA: 12.0
• DeepStream Version : 6.2.0
• TensorRT Version : 8.5.2
• NVIDIA GPU Driver Version (valid for GPU only) : 525.85.12

The graph is shown below. It connects the single source input directly to the NVIDIA Video Renderer, with the renderer's sync and qos properties set to false. The RTSP device and the host are both plugged into the same router with network cables.

Maybe you can try tuning the RTSP parameters to improve the quality.

Does your RTSP server support the TCP protocol? If so, you can set "select-rtp-protocol" to 4. The "latency" property can also be set to a larger value, and enlarging "udp-buffer-size" may help as well.


Hi, I tried what you suggested: select-rtp-protocol=4, latency=100, and udp-buffer-size increased to 3-4 times its original value. But it did not work. The same RTSP video read with OpenCV is smooth.
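A quick way to compare against Composer is to test the same stream with a plain GStreamer pipeline using the NVIDIA hardware decoder. A small sketch that builds such a command (the URL is a placeholder; element names are the standard NVIDIA GStreamer plugins, and running the command of course requires a live RTSP server):

```python
def rtsp_test_pipeline(url: str, use_tcp: bool = True, latency_ms: int = 200) -> str:
    """Build a gst-launch-1.0 command to play an RTSP H.264 stream
    through the NVIDIA hardware decoder, outside of Graph Composer."""
    protocols = "tcp" if use_tcp else "udp"
    return (
        f"gst-launch-1.0 rtspsrc location={url} protocols={protocols} latency={latency_ms} "
        "! rtph264depay ! h264parse ! nvv4l2decoder ! nvvideoconvert "
        "! autovideosink sync=false"
    )

# Placeholder URL for illustration only:
print(rtsp_test_pipeline("rtsp://192.168.1.10:554/stream"))
```

If this pipeline plays smoothly while the Composer graph lags, the network and decoder are fine and the problem is inside the graph.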

You can try enabling "sync" on your video renderer extension.

It still did not work.

Hi, the problem still exists. How can I fix it? This is the log:

2023-05-24 08:30:34 [595,585ms] [Error] [omni.kit.app._impl] [py stderr]: WARNING from element NVidia Video Renderer/NVidia Video Renderer23-sink: A lot of buffers are being dropped.

Please monitor the GPU usage with "nvidia-smi dmon" while running the graph.
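The "dec" column of "nvidia-smi dmon" reports NVDEC (hardware decoder) utilization. A small sketch of reading that column from a dmon data line (the sample line below is made up for illustration; the column order assumed here is the default gpu/pwr/gtemp/mtemp/sm/mem/enc/dec layout, which can differ between driver versions):

```python
# nvidia-smi dmon prints one data line per second per GPU, e.g.:
#   # gpu   pwr gtemp mtemp    sm   mem   enc   dec
#   # Idx     W     C     C     %     %     %     %
#       0    62    55     -    12     8     0     0

def dec_utilization(dmon_line: str) -> int:
    """Extract the dec (%) column from a dmon data line,
    assuming the default column order shown above."""
    fields = dmon_line.split()
    return int(fields[7])  # gpu pwr gtemp mtemp sm mem enc dec

sample = "    0    62    55     -    12     8     0     0"
print(dec_utilization(sample))  # -> 0
```

A "dec" value stuck at 0 while the stream is playing means the hardware decoder is not being used, which matches the buffer-drop symptom in the log above.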

According to the log, the hardware decoder is not working. What is the video format inside your RTSP source?

It is H.264. Is that the video format you were asking about?

Since there has been no update from you for a while, we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

Yes. It is strange that the hardware decoder is not enabled.

You can also monitor the CPU load in your case.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.