• Hardware Platform (GPU)
NVIDIA GeForce GTX 1070
• DeepStream Version
6.3
• NVIDIA GPU Driver Version
530.30.02
Task:
I have several cameras that I connect to over RTSP; I need to run a detection and tracking model on their streams and display the result on screen. Everything runs in a container on a remote server that I access via SSH. Since the server has no monitor attached, I use X11 forwarding to show the display on my local machine (this part works).
What am I doing:
I'm using the deepstream-test3 example from the repository. I run it as follows:
python3 deepstream_test_3.py -i rtsp://bla bla bla
The code works and I get video on screen, but the frames in that video are mixed up: they jump around in time. Each frame carries a timestamp overlay, and it shows that the frames are shuffled, for example: 09:12:30, 09:12:31, 09:12:32, 09:12:30, 09:12:33, 09:12:31, 09:12:34.
At startup, it creates the sink like this:
sink = Gst.ElementFactory.make("nveglglessink", "nvvideo-renderer")
I tried replacing nveglglessink with xvimagesink and got the error shown in the screenshot.
How can I solve this? What am I missing, and what causes the time jumps?
I used a ready-made devel container, and the source code runs fine apart from the shuffled frames. Could wrong dependencies cause the wrong frame order?
There was a suggestion that the problem is in uridecodebin, so I decided to use rtspsrc instead. For this I wrote a gst-launch pipeline, as follows:
gst-launch-1.0 rtspsrc location=rtsp://blabla latency=100 ! queue ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! videoscale ! 'video/x-raw, width=640, height=480' ! ximagesink
This pipeline works correctly. After that I want to write the same pipeline in Python; for example, I write:
pipeline = Gst.parse_launch("rtspsrc location=rtsp://blabla latency=100 ! queue ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! videoscale ! video/x-raw, width=640, height=480 ! ximagesink")… state.PLAYING…loop.run
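Written out in full, the parse_launch version above might look like the following sketch (the RTSP URL is the same placeholder as above; GStreamer and its Python bindings must be installed):

```python
# Sketch of the full parse_launch version (the RTSP URL is a placeholder).
# Call main() to run it; requires GStreamer and the gi Python bindings.

PIPELINE_DESC = (
    "rtspsrc location=rtsp://blabla latency=100 ! queue ! "
    "rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! videoscale ! "
    "video/x-raw, width=640, height=480 ! ximagesink"
)

def main():
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)
    pipeline = Gst.parse_launch(PIPELINE_DESC)

    loop = GLib.MainLoop()
    bus = pipeline.get_bus()
    bus.add_signal_watch()
    # Stop the main loop on error or end-of-stream.
    bus.connect("message::error", lambda b, m: loop.quit())
    bus.connect("message::eos", lambda b, m: loop.quit())

    pipeline.set_state(Gst.State.PLAYING)
    try:
        loop.run()
    finally:
        pipeline.set_state(Gst.State.NULL)
```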
I get warnings:
"udpsrc gstudpsrc.c:1445:gst_udpsrc_open: warning: Could not create a buffer of requested 524288 bytes (Operation not permitted) Need net.admin privilege?"
and also
"udpsrc gstudpsrc.c:1445:gst_udpsrc_open: have udp buffer of 212992 bytes while 524288 were requested"
But despite the warnings, I manage to connect to the rtsp server and display the video on the screen without mixing frames (although there are some lags).
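For context, those udpsrc warnings mean the kernel's socket receive-buffer ceiling (net.core.rmem_max, 212992 bytes by default on many Linux systems) is below the 524288 bytes rtspsrc requested. If it matters, raising the limit would be done on the host (or with the container granted the corresponding privilege), roughly like this sketch:

```shell
# Check the current receive-buffer ceiling (212992 is a common default):
sysctl net.core.rmem_max

# Raise it as root so udpsrc can get the 524288-byte buffer it asks for.
# Inside an unprivileged container this usually has to happen on the host.
sysctl -w net.core.rmem_max=524288
```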
But if I rewrite the same pipeline using ElementFactory, as follows:
import gi
gi.require_version('Gst', '1.0')
from gi.repository import GLib, Gst
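For completeness, a minimal sketch of that ElementFactory version (the URL is a placeholder, and the 640x480 capsfilter from the launch string is omitted for brevity). Note that rtspsrc creates its source pads dynamically after negotiating with the server, so linking has to happen in a pad-added handler:

```python
# Sketch of the same pipeline built element by element (URL is a placeholder).
# Call build_and_run() to run it; requires GStreamer and the gi bindings.

ELEMENTS = ["queue", "rtph264depay", "h264parse", "avdec_h264",
            "videoconvert", "videoscale", "ximagesink"]

def build_and_run(url="rtsp://blabla"):
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)
    pipeline = Gst.Pipeline.new("rtsp-pipeline")

    src = Gst.ElementFactory.make("rtspsrc", "source")
    src.set_property("location", url)
    src.set_property("latency", 100)
    pipeline.add(src)

    # Create and link the static part of the chain: queue ... ximagesink.
    chain = []
    for name in ELEMENTS:
        elem = Gst.ElementFactory.make(name, name)
        pipeline.add(elem)
        if chain:
            chain[-1].link(elem)
        chain.append(elem)

    # rtspsrc produces its pads only once the stream is negotiated,
    # so link the new pad to the head of the chain when it appears.
    def on_pad_added(_src, pad):
        sink_pad = chain[0].get_static_pad("sink")
        if not sink_pad.is_linked():
            pad.link(sink_pad)

    src.connect("pad-added", on_pad_added)

    loop = GLib.MainLoop()
    pipeline.set_state(Gst.State.PLAYING)
    try:
        loop.run()
    finally:
        pipeline.set_state(Gst.State.NULL)
```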
I made some progress and was able to write a pipeline that starts and displays the video stream in the correct order (no shuffled frames). But when I try to add pgie for detection, I get warnings and errors.
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib
I also want to note that when I remove pgie from the pipeline, the warnings remain, but there is no error and the code runs as a whole.
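For reference, the pgie section I am trying to add follows the deepstream-test3 pattern: nvinfer only accepts batched NVMM buffers, so decoded frames have to pass through nvvideoconvert and nvstreammux first. A minimal sketch (the config-file path is a placeholder):

```python
# Sketch of inserting detection into the pipeline (config path is a placeholder).
# nvinfer requires batched NVMM memory, so frames go through nvvideoconvert
# and nvstreammux before inference.

PGIE_CONFIG = "dstest3_pgie_config.txt"  # placeholder path

def make_infer_section(Gst, pipeline, width=640, height=480):
    # Convert decoded system-memory frames to NVMM memory for the mux.
    conv = Gst.ElementFactory.make("nvvideoconvert", "to-nvmm")
    caps = Gst.ElementFactory.make("capsfilter", "nvmm-caps")
    caps.set_property(
        "caps", Gst.Caps.from_string("video/x-raw(memory:NVMM)"))

    # Batch frames (batch-size 1 for a single camera).
    mux = Gst.ElementFactory.make("nvstreammux", "stream-muxer")
    mux.set_property("width", width)
    mux.set_property("height", height)
    mux.set_property("batch-size", 1)
    mux.set_property("batched-push-timeout", 4000000)

    # Primary inference engine (pgie).
    pgie = Gst.ElementFactory.make("nvinfer", "primary-inference")
    pgie.set_property("config-file-path", PGIE_CONFIG)

    for e in (conv, caps, mux, pgie):
        pipeline.add(e)

    conv.link(caps)
    # nvstreammux exposes request pads named sink_0, sink_1, ...
    caps.get_static_pad("src").link(mux.get_request_pad("sink_0"))
    mux.link(pgie)
    # Link the decoder to conv upstream and pgie to the osd/sink downstream.
    return conv, pgie
```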
I also wanted to ask whether there are ready-made pipelines that use rtspsrc rather than decodebin, and what causes the incorrect processing of the RTSP stream with decodebin in general.