I have cameras that produce an RTP stream (UDP, H.264 encoded), and I want to use DeepStream to run a YOLOv3 model on these camera feeds. I can already open the stream with a GStreamer command.
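Something along these lines (a sketch; the port, payload, and clock-rate values are placeholders and depend on the camera):

```
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96,clock-rate=90000" \
  ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
```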
I noticed that there are examples for RTSP sources that only require changing the config file. My question is: is it possible to use DeepStream with a plain RTP stream? If it is possible, how should I change the config file?
Hi, thank you for your answer, but I think the referenced post is about an RTSP stream, not RTP, so the configuration file in my case would be different. I am not sure how to open an RTP stream by making changes to the configuration file; I don't think I can write an rtsp://… URI, because it is a plain RTP stream. By the way, I ran the YOLO demo in the DS 4.0.1 samples with a USB camera and it worked fine. The next step for me is to open it with the RTP stream, if that is possible. I forgot to mention that I receive the stream over Ethernet.
I think you’re pretty close, actually. If your example works, then keep using udpsrc instead of rtspsrc. The rest of the pipeline can probably stay the same as what DeepStream builds (except for things like RTCP callbacks, etc.).
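For instance, a minimal end-to-end sketch (untested; the port and caps values are placeholders, and config_infer_primary_yoloV3.txt refers to the config from the objectDetector_Yolo sample):

```
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
  ! rtph264depay ! h264parse ! nvv4l2decoder \
  ! m.sink_0 nvstreammux name=m batch-size=1 width=1920 height=1080 live-source=1 \
  ! nvinfer config-file-path=config_infer_primary_yoloV3.txt \
  ! nvvideoconvert ! nvdsosd ! nveglglessink
```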
Oh, thanks for the fast reply, but I could not figure out where to use udpsrc. I used the examples in DeepStream 4.0.1 and ran the demo with CameraV4L2, which is the webcam (USB camera). Now, instead of that, I need to run YOLO on the RTP stream, which I can open in GStreamer with the command in #1. Which config files should I change? I am a little confused.
Thanks for your replies. Following the comments, I used the pipeline at #6 and then ran into an error caused by missing caps on udpsrc.
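This is the shape of pipeline that fails, since udpsrc pushes buffers without announcing application/x-rtp caps, so rtph264depay cannot negotiate (a sketch; the port is a placeholder):

```
# fails at runtime: rtph264depay never sees application/x-rtp caps
gst-launch-1.0 udpsrc port=5000 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
```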
Then I used the following pipeline (it may be helpful for people who have the same problem):
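Roughly like this (a sketch; the exact port, payload, and clock-rate values depend on the camera):

```
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96,clock-rate=90000" \
  ! rtph264depay ! h264parse ! queue2 \
  ! nvv4l2decoder ! nveglglessink
```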
I used queue2 to get rid of the packet drops; without the queue there were sudden losses in the video. But I observed that the delay increased, and I cannot figure out what the problem is. It was about 240 ms without nvv4l2decoder; when I used nvv4l2decoder and nveglglessink, it jumped to about 2 s. Do you have any idea how to solve this delay problem? In addition, there are still sudden drops in the video, though fewer, so I would be glad for any further recommendations.
There’s another element you could experiment with: rtpjitterbuffer. It has a low likelihood of solving your problem, but you could at least try different values of its ‘mode’ property (see the example pipeline after the property listing below). It also has some other properties you might want to tune. Put it before rtph264depay.
```
  mode                : Control the buffering algorithm in use
                        flags: readable, writable
                        Enum "RTPJitterBufferMode" Default: 1, "slave"
                           (0): none             - Only use RTP timestamps
                           (1): slave            - Slave receiver to sender clock
                           (2): buffer           - Do low/high watermark buffering
                           (4): synced           - Synchronized sender and receiver clocks
```
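For instance (a sketch; mode=none disables clock slaving and uses only RTP timestamps, and the latency value is just a starting point to tune):

```
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
  ! rtpjitterbuffer mode=none latency=50 \
  ! rtph264depay ! h264parse ! queue2 ! nvv4l2decoder ! nveglglessink
```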
You could also try setting the ‘sync’ and ‘max-lateness’ properties of nveglglessink to false and 0, respectively.
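That would look like this (again a sketch with placeholder port and caps):

```
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
  ! rtph264depay ! h264parse ! nvv4l2decoder \
  ! nveglglessink sync=false max-lateness=0
```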