Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU): RTX 3080
• DeepStream Version: 6.1.1
• JetPack Version (valid for Jetson only):
• TensorRT Version: 8.6
• DeepStream API: Python API
I am running an application with a heterogeneous camera setup: a couple of V4L2 cameras and a couple of IP cameras providing RTSP streams. When I run the application, the V4L2 cameras run in real time, but the RTSP cameras have a huge delay that accumulates over time. I haven’t been able to find a latency property on uridecodebin. How do I work my way around this?
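For reference, one common workaround (a sketch, not an official DeepStream recipe): uridecodebin emits a `source-setup` signal right after it creates its internal source element, and when the URI is an RTSP one that source is an rtspsrc, which does expose a `latency` property. The latency value of 200 ms below is a placeholder to tune for your network.

```python
# Sketch: set rtspsrc's "latency" via uridecodebin's "source-setup" signal.
# Assumes PyGObject / GStreamer Python bindings are installed.
try:
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst
    HAVE_GST = True
except (ImportError, ValueError):
    HAVE_GST = False

RTSP_LATENCY_MS = 200  # placeholder jitter-buffer size; tune for your network


def on_source_setup(uridecodebin, source):
    """Called by uridecodebin just after it creates its source element."""
    factory = source.get_factory()
    if factory is not None and factory.get_name() == "rtspsrc":
        source.set_property("latency", RTSP_LATENCY_MS)


def make_rtsp_source(uri):
    """Build a uridecodebin whose internal rtspsrc gets the reduced latency."""
    src = Gst.ElementFactory.make("uridecodebin", None)
    src.set_property("uri", uri)
    src.connect("source-setup", on_source_setup)
    return src
```

The callback only touches rtspsrc, so the same bin-construction code can be shared by the V4L2 and RTSP branches of the pipeline.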
The IP cameras’ RTSP streams are transferred over Ethernet, while the V4L2 camera streams arrive over CSI, USB, etc. The hardware interfaces are physically different; how can you expect them to have the same delay?
Thank you, Fiona, for your input. I think there is a bit of a misunderstanding: the point was that the build-up of frames from the RTSP sources causes stuttering and frame loss on the V4L2 device. My question was aimed at how I can prevent the RTSP delay/latency from affecting my V4L2 source’s throughput.
Regarding the second part, I am aware there is an rtspsrc element at play inside the bin. I wanted to know whether uridecodebin can somehow be used to access the latency property of rtspsrc, which, as per your suggestion, can be achieved using GstChildProxy; I will try that.
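For anyone following the GstChildProxy route, here is a minimal sketch of what that could look like. uridecodebin is a GstBin, and GstBin implements the GstChildProxy interface, so it emits `child-added` whenever it creates an internal element; the callback below watches for the rtspsrc and sets its latency. The 200 ms value is a placeholder.

```python
# Sketch: reach rtspsrc's "latency" through the "child-added" signal that
# uridecodebin inherits via the GstChildProxy interface (GstBin implements it).
RTSP_LATENCY_MS = 200  # placeholder value; tune for your network


def on_child_added(child_proxy, child, name):
    """Set latency as soon as the bin's internal rtspsrc appears."""
    # child-added reports every new child; only rtspsrc has a latency property.
    get_factory = getattr(child, "get_factory", None)
    factory = get_factory() if get_factory is not None else None
    if factory is not None and factory.get_name() == "rtspsrc":
        child.set_property("latency", RTSP_LATENCY_MS)


# usage, given a uridecodebin created with Gst.ElementFactory.make:
#     src.connect("child-added", on_child_added)
```

Compared with the `source-setup` signal, this fires for every child the bin adds, so the factory-name check is what keeps it from touching unrelated elements.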