I am currently using:
• Hardware Platform (Jetson) - Xavier NX
• DeepStream Version - 6.1.1
• JetPack Version - 5.0.2
• TensorRT Version - 8.4.1 (shipped with JetPack 5.0.2)
• Issue Type - bugs/questions
• How to reproduce the issue? - Run the deepstream-test-3.py app with 2 live RTSP streams
• Requirement details - Display the live video output in real time, after inference, in a window
When I feed 2 live RTSP streams into the application, I notice in the output window that one stream is displayed in real time with negligible latency, while the second stream lags noticeably behind the first. This behavior is consistent for a given RTSP stream. In an effort to debug, I tried swapping the positions of the 2 RTSP URIs in the command-line arguments, but that didn't help. However, when I ran the 2 RTSP streams in two separate instances of the application at the same time, the output video for both streams was displayed with negligible latency (live and in sync) in 2 separate windows. Is there something I can change in the application so that a single instance with 2 input RTSP streams displays both output videos in real time, with low latency and in sync?
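To be concrete, here is a minimal sketch of the property changes I'm considering in deepstream-test-3.py. The property names (`live-source` and `batched-push-timeout` on nvstreammux, `sync` on the sink) are real, but the specific values are guesses I'd experiment with, not a confirmed fix:

```python
def tune_for_live_sources(streammux, sink):
    """Apply latency-oriented settings to a deepstream-test-3.py pipeline.

    `streammux` is the nvstreammux element and `sink` is the video sink
    (e.g. nveglglessink); both are ordinary Gst.Element objects created
    by the sample app.
    """
    # Tell the muxer its inputs are live, so it pushes batches on a timer
    # instead of waiting for every source to contribute a frame.
    streammux.set_property("live-source", 1)
    # Push a possibly incomplete batch after ~33 ms (one 30 fps frame
    # interval) rather than stalling on the slower stream.
    streammux.set_property("batched-push-timeout", 33000)
    # Render frames as they arrive instead of syncing to the pipeline clock.
    sink.set_property("sync", 0)
```

I'd call this right after the elements are created, before setting the pipeline to PLAYING.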
I was running two different instances of the same application, with one RTSP input each, on the same Xavier NX board, and I saw no difference in latency between them. Whereas when I feed both RTSP streams to a single instance of the application, I see a latency difference between the 2 output streams. I also monitored performance in both cases using tegrastats and the Jetson Power GUI, but there was no significant difference in GPU or CPU utilization, and NVDEC1 and NVDEC2 were active in both cases. I believe there is something at the application level that can be configured to keep the 2 video streams synchronized.
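One application-level suspect I plan to rule out is the rtspsrc jitterbuffer: rtspsrc defaults to a 2000 ms `latency`, and the uridecodebin that the sample creates per source emits a `source-setup` signal where that property can be pinned. A sketch of the handler (the 200 ms value is just a guess for my network, not a recommended setting):

```python
def on_source_setup(uridecodebin, source):
    """Handler for uridecodebin's "source-setup" signal.

    `source` is the freshly created source element; for RTSP URIs that
    is rtspsrc, which exposes a jitterbuffer `latency` property (ms).
    """
    # Only rtspsrc-like sources have a `latency` property; pin both
    # streams to the same value so neither buffers longer than the other.
    if source.find_property("latency") is not None:
        source.set_property("latency", 200)
```

In the sample's `create_source_bin` I would connect it per source bin with `uri_decode_bin.connect("source-setup", on_source_setup)`.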