I solved the problem with the frame order, but now I have other problems:
Low frame rate, fluctuating between 0.4 and 7 FPS.
The two streams run at different speeds: the timestamp overlaid on each frame differs between them, and the difference keeps growing. Yet if I connect to the same cameras via VLC, the times coincide, i.e. I get frames with the current time from both cameras.
Can you guess what the problem might be, and whether I built the pipeline correctly? I would also welcome suggestions on how to solve this.
I will also attach a file with level-4 debug logs.
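For reference, my pipeline looks roughly like the sketch below (camera URLs and the nvinfer config path are placeholders, and the exact element chain is an approximation of my setup, not a verbatim copy). `live-source=1`, `batched-push-timeout`, and the `rtspsrc latency` jitter buffer are the knobs most relevant to live RTSP input:

```shell
gst-launch-1.0 \
  nvstreammux name=mux batch-size=2 width=1920 height=1080 \
              live-source=1 batched-push-timeout=40000 ! \
  nvinfer config-file-path=config_infer.txt ! \
  nvmultistreamtiler rows=1 columns=2 ! nvvideoconvert ! nvdsosd ! \
  fakesink sync=false \
  rtspsrc location=rtsp://camera1/stream latency=200 ! rtph264depay ! \
  h264parse ! nvv4l2decoder ! mux.sink_0 \
  rtspsrc location=rtsp://camera2/stream latency=200 ! rtph264depay ! \
  h264parse ! nvv4l2decoder ! mux.sink_1
```

Here `batched-push-timeout=40000` (microseconds) matches a 25 FPS source, so the muxer pushes a possibly incomplete batch rather than stalling the faster stream behind the slower one.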
I didn't quite understand what the minimum-pts/maximum-pts parameters are. PTS is the Presentation Time Stamp, i.e. a timestamp, so in effect minimum pts would be the initial timestamp and maximum pts the final one. In other words, when I start the stream, the very first timestamp will always be the minimum and the current one will always be the maximum, won't it? Or do I need to recompute these values over some time window?
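To make my question concrete, here are the two readings I can imagine, as a small sketch (the function names are my own, not anything from the SDK; I'm asking which interpretation the parameters actually use):

```python
def lifetime_min_max(pts_values):
    """Reading 1: min is the very first PTS of the stream's life,
    max is the latest PTS seen so far."""
    return min(pts_values), max(pts_values)


def windowed_min_max(pts_values, window):
    """Reading 2: min/max are recomputed over a sliding window
    of the most recent frames only."""
    recent = pts_values[-window:]
    return min(recent), max(recent)


# PTS in ms, one frame every 40 ms (25 FPS)
pts = [0, 40, 80, 120, 160, 200]
print(lifetime_min_max(pts))      # (0, 200)
print(windowed_min_max(pts, 3))   # (120, 200)
```

Under reading 1 the minimum never changes after startup; under reading 2 both values track the recent interval.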
I have 2 cameras; one loses some packets and the other doesn't, and when I run them together they play at different speeds (the time shown on the video does not match). Do I understand correctly that to work correctly I must either synchronize the frames, losing throughput quite significantly, or accept the desynchronization? That is, with unstable packet delivery from the RTSP servers, synchronization cannot be achieved?
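Just to check I understand the trade-off: the "synchronize" option would mean something like pairing frames by nearest timestamp and dropping anything that has no partner within a tolerance. A minimal sketch of that idea (my own toy code, not a DeepStream API; timestamps in ms):

```python
def pair_frames(stream_a, stream_b, tol_ms):
    """Pair each frame PTS in stream_a with the closest PTS in stream_b.
    Frames with no partner within tol_ms are dropped, which keeps the
    streams aligned at the cost of throughput."""
    pairs = []
    for ta in stream_a:
        tb = min(stream_b, key=lambda t: abs(t - ta))
        if abs(tb - ta) <= tol_ms:
            pairs.append((ta, tb))
    return pairs


a = [0, 40, 80, 120]   # healthy camera, 25 FPS
b = [0, 40, 120]       # lossy camera: the frame at t=80 was lost
print(pair_frames(a, b, tol_ms=20))  # [(0, 0), (40, 40), (120, 120)]
```

The frame at t=80 from the healthy camera is discarded because the lossy camera has nothing close enough, so output rate drops to the rate of the worse stream.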
You may think so. We need a balance between inference performance and synchronization of the streams. Even from the RTSP playback standpoint, a player cannot guarantee smoothness when the stream keeps losing frames.
And if I can't guarantee that the streams don't lose packets, how can I force them to synchronize with each other? Or what options are there for doing this? As I now understand it, if my first camera constantly loses N packets, the gap between the streams will keep growing; is that right?
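A back-of-the-envelope model of why I expect the gap to grow linearly, assuming the pipeline plays every frame it actually receives at a fixed rate instead of honoring timestamps (so each lost frame delays the lossy stream by one frame period):

```python
def drift_seconds(lost_frames_per_sec, frame_period_s, elapsed_s):
    """Accumulated time gap between a lossless and a lossy stream
    when playback consumes frames at a fixed rate: every lost frame
    pushes the lossy stream one frame period further behind."""
    return lost_frames_per_sec * frame_period_s * elapsed_s


# 25 FPS camera losing 2 frames per second, after one minute:
print(drift_seconds(2, 1 / 25, 60))  # about 4.8 seconds behind
```

So after an hour the same camera would be several minutes behind, which matches what I see: the difference never shrinks on its own unless something resynchronizes by timestamp.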