Different speeds of two or more RTSP streams, low frame rate

• Hardware Platform (GPU)
NVIDIA GeForce GTX 1070
• DeepStream Version
6.3
• NVIDIA GPU Driver Version
530.30.02

I took the example deepstream_test_3.py and ran it on two RTSP streams. I got a result I did not expect: the frames in the output video were mixed up in time (they appeared in the wrong order).

I assumed that the problem was in decodebin and decided to explicitly specify which decoder to use. I rewrote the source code and got the pipeline you can see in the image.
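
To illustrate the kind of explicit pipeline I mean, here is a sketch (not my exact code; it assumes H.264 cameras, and the element names are illustrative; for H.265 you would swap in rtph265depay and h265parse):

```python
#!/usr/bin/env python3
# Sketch of an RTSP source bin with explicit depay/parse/decode elements
# instead of decodebin. Assumes H.264 cameras.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

def make_rtsp_source_bin(index: int, uri: str) -> Gst.Bin:
    bin_ = Gst.Bin.new(f"source-bin-{index}")

    src = Gst.ElementFactory.make("rtspsrc", f"rtsp-src-{index}")
    src.set_property("location", uri)

    depay = Gst.ElementFactory.make("rtph264depay", f"depay-{index}")
    parse = Gst.ElementFactory.make("h264parse", f"parse-{index}")
    decoder = Gst.ElementFactory.make("nvv4l2decoder", f"decoder-{index}")

    for elem in (src, depay, parse, decoder):
        bin_.add(elem)
    depay.link(parse)
    parse.link(decoder)

    # rtspsrc pads appear dynamically, so link the depayloader only once
    # the video RTP pad shows up (skip audio and other streams).
    def on_pad_added(_src, pad):
        caps = pad.get_current_caps() or pad.query_caps(None)
        structure = caps.get_structure(0)
        if structure.get_string("media") == "video":
            pad.link(depay.get_static_pad("sink"))

    src.connect("pad-added", on_pad_added)

    # Expose the decoder output as a ghost pad so the bin can be linked
    # to an nvstreammux sink_%u pad downstream.
    ghost = Gst.GhostPad.new("src", decoder.get_static_pad("src"))
    bin_.add_pad(ghost)
    return bin_
```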

I solved the problem with the frame order, but now I have other problems:

  1. Low frame rate, from 0.4 to 7 FPS (see the measurement sketch after this list)
  2. The two streams run at different speeds: the time overlaid on the frames differs between them, and the gap keeps growing. Yet if I connect to the same cameras via VLC, the times coincide, i.e. I get frames with the current time from both cameras.
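
For reference, here is one way such a per-pad frame rate can be measured with a buffer pad probe (a sketch; the probe placement, e.g. the tiler sink pad, and the averaging window are arbitrary choices of mine):

```python
import time
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

# Rolling FPS counter: attach to any sink pad to log how fast
# buffers arrive there.
class FpsProbe:
    def __init__(self, name: str, window: float = 5.0):
        self.name = name
        self.window = window
        self.count = 0
        self.start = time.monotonic()

    def __call__(self, pad, info):
        self.count += 1
        elapsed = time.monotonic() - self.start
        if elapsed >= self.window:
            print(f"{self.name}: {self.count / elapsed:.1f} fps")
            self.count = 0
            self.start = time.monotonic()
        return Gst.PadProbeReturn.OK

# Usage, assuming `tiler` is the nvmultistreamtiler in the pipeline:
# pad = tiler.get_static_pad("sink")
# pad.add_probe(Gst.PadProbeType.BUFFER, FpsProbe("tiler-sink"))
```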

Can you tell what the problem might be, and did I build the pipeline correctly? I would also be glad to hear suggestions on how to solve this.

I will also attach a file with level-4 debug logs:
logs2.txt (241.6 KB)

Please refer to DeepStream SDK FAQ - Intelligent Video Analytics / DeepStream SDK - NVIDIA Developer Forums

Thanks for the answer. The first couple of points have already improved performance, but I have a question about the following lines.

Continuing the discussion from DeepStream SDK FAQ:

I didn’t quite understand what the minimum pts / maximum pts parameters are. PTS is the Presentation Time Stamp, i.e. a timestamp, so presumably minimum pts is the initial timestamp and maximum pts is the final one. That is, when I start the stream, the very first timestamp will always be the minimum and the current one will always be the maximum, won’t it? Or do I need to recalculate these values over some time interval?

Thank you for your correction. It’s a typo; it actually refers to fps, not pts. Note that the parameters you attached are configured for the new nvstreammux.
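
For later readers, a sketch of how these fps bounds are typically supplied to the new nvstreammux through its config-file-path property (key names follow the new nvstreammux config-file documentation; the values are placeholders):

```python
import os

# The fps bounds live in a config file read by the new nvstreammux.
# The new muxer is selected via an environment variable, which must be
# set before the pipeline (and hence the plugin) is loaded.
os.environ["USE_NEW_NVSTREAMMUX"] = "yes"

MUX_CONFIG = """\
[property]
algorithm-type=1
batch-size=2
## Upper/lower bounds on how fast the muxer pushes batches,
## expressed as fractions (numerator/denominator), not raw PTS values.
overall-max-fps-n=30
overall-max-fps-d=1
overall-min-fps-n=10
overall-min-fps-d=1
## At most this many frames from a single source in one batch.
max-same-source-frames=1
"""

with open("mux_config.txt", "w") as f:
    f.write(MUX_CONFIG)

# Later, after creating the muxer:
# streammux = Gst.ElementFactory.make("nvstreammux", "stream-muxer")
# streammux.set_property("config-file-path", "mux_config.txt")
```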

I have 2 cameras; one loses some packets and the other doesn’t, and when I run them together they work at different speeds (the time on the video does not match). Do I understand correctly that either I synchronize the frames, losing quite a lot of speed, or I accept the desynchronization? In other words, with unstable packet delivery from the RTSP servers, synchronization cannot be achieved?

You may think of it that way. We need a balance between inference performance and synchronization of the streams. Even from the RTSP playback point of view, a player cannot guarantee smoothness when the stream keeps losing frames.

And if I can’t guarantee that the streams don’t lose packets, how can I force them to synchronize with each other? What options are there for doing this? As I understand it now, if my first camera constantly loses N packets, the gap between the streams will keep growing. Is that so?

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

If there is heavy packet loss, you may ask the camera vendor whether the camera supports TCP connections. It is better to use a TCP connection than UDP.

You can also set a longer receiver latency to get smoother streams: rtspsrc (gstreamer.freedesktop.org)
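
A sketch of both suggestions applied to rtspsrc (the property names are from the rtspsrc documentation; the URI and latency value are placeholders):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

src = Gst.ElementFactory.make("rtspsrc", "rtsp-src-0")
src.set_property("location", "rtsp://camera-1/stream")  # placeholder URI

# RTP over TCP: lost packets are retransmitted instead of dropped
# (the camera must support it).
Gst.util_set_object_arg(src, "protocols", "tcp")

# Jitter-buffer latency in milliseconds; the rtspsrc default is 2000.
# A larger value smooths uneven packet arrival at the cost of delay.
src.set_property("latency", 4000)
```

If the streams still enter through uridecodebin, as in deepstream_test_3.py, the same properties can be set from its source-setup signal callback, which hands you the rtspsrc instance before it starts.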

From the DeepStream side, what you can do is set the correct nvstreammux parameters to try to sync the streams: DeepStream SDK FAQ - Intelligent Video Analytics / DeepStream SDK - NVIDIA Developer Forums
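
In case it helps others, a sketch of the muxer-side synchronization knobs that FAQ points at; as I understand it, the relevant properties are sync-inputs and max-latency (names from gst-inspect of nvstreammux; the values are placeholders):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

streammux = Gst.ElementFactory.make("nvstreammux", "stream-muxer")
streammux.set_property("batch-size", 2)

# Synchronize buffers on their timestamps before batching instead of
# muxing frames in arrival order.
streammux.set_property("sync-inputs", True)

# How long the muxer may wait for a lagging stream before pushing the
# batch anyway; larger values tolerate more jitter but add delay.
streammux.set_property("max-latency", 200 * Gst.MSECOND)

# The default muxer also needs live-source for RTSP inputs; the new
# nvstreammux does not expose this property, hence the guard.
if streammux.find_property("live-source") is not None:
    streammux.set_property("live-source", True)
```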
