How to send video over RTSP in OpenCV

Hi NVs,
I am running some tests that send video over RTSP on a Jetson Nano, and I have achieved it with a UDP connection, as shown below.

#!/usr/bin/env python
import cv2
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstRtspServer', '1.0')
from gi.repository import GObject, Gst, GstRtspServer

def main():

    # Initialize GStreamer before using the RTSP server bindings
    Gst.init(None)

    # Jetson platform: hardware-encode the frames to H.264, packetize them as
    # RTP and send them to localhost over UDP
    out_send = cv2.VideoWriter('appsrc is-live=true ! videoconvert ! \
                                omxh264enc bitrate=12000000 ! video/x-h264, \
                                stream-format=byte-stream ! rtph264pay pt=96 ! \
                                udpsink host=127.0.0.1 port=5400 async=false',
                                cv2.CAP_GSTREAMER, 0, 30, (1920,1080), True)

    if not out_send.isOpened():
        print('VideoWriter not opened')
        exit(0)
 
    rtsp_port_num = 8554 
 
    server = GstRtspServer.RTSPServer.new()
    server.props.service = "%d" % rtsp_port_num
    server.attach(None)
    
    factory = GstRtspServer.RTSPMediaFactory.new()
    factory.set_launch("(udpsrc name=pay0 port=5400 buffer-size=524288 \
                        caps=\"application/x-rtp, media=video, clock-rate=90000, \
                        encoding-name=(string)H264, payload=96 \")")

    factory.set_shared(True)
    server.get_mount_points().add_factory("/ds-test", factory)
 
    # Print the RTSP stream URL
    print("\n *** Launched RTSP Streaming at rtsp://localhost:%d/ds-test ***\n\n" % rtsp_port_num)
 
    cap = cv2.VideoCapture("rtsp://admin:jiaxun123@192.168.170.65:554")
 
    while True:
        ret, mat = cap.read()
        if not ret:
            break
        out_send.write(mat)
        cv2.waitKey(30)
        
if __name__ == '__main__':
    main()

But I want to achieve this over a TCP connection rather than with udpsink and udpsrc. I know the relevant elements are tcpserversink and tcpclientsrc, but I can't build a working pipeline. Can you help me? Thanks!

Hi,
A user has shared some information about this; please refer to this post:
Gstreamer TCPserversink 2-3 seconds latency
and see if it can be applied to your use case.

Hi @DaneLLL,

I have read this post, but it doesn't solve my problem.

Its final solution is the udpsink method, which I can already achieve, but the tcpserversink approach still has problems.

Thanks!

Would you try replacing rtph264pay with matroskamux ! tcpserversink? You could also get more information about where the pipeline is failing by running it in a terminal with a videotestsrc.
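Here is a minimal sketch of that suggestion, assuming the same Jetson omxh264enc settings as in your original pipeline; the bind address, port and the receiver command are placeholders, so adjust them for your setup.

#!/usr/bin/env python
import cv2

# Sketch only: mux the H.264 stream into Matroska and serve it over TCP.
# h264parse converts the encoder's byte-stream output into the form
# matroskamux expects; host/port values are placeholders.
out_send = cv2.VideoWriter('appsrc is-live=true ! videoconvert ! \
                            omxh264enc bitrate=12000000 ! h264parse ! \
                            matroskamux ! tcpserversink host=0.0.0.0 port=5400',
                            cv2.CAP_GSTREAMER, 0, 30, (1920,1080), True)

if not out_send.isOpened():
    print('VideoWriter not opened')

# To see where the pipeline fails, test the encoder part in a terminal first:
#   gst-launch-1.0 videotestsrc ! videoconvert ! omxh264enc ! h264parse ! \
#       matroskamux ! tcpserversink host=0.0.0.0 port=5400
# A client can then read the stream back with, for example:
#   gst-launch-1.0 tcpclientsrc host=<jetson-ip> port=5400 ! matroskademux ! \
#       h264parse ! avdec_h264 ! videoconvert ! autovideosink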

Alternatively, you could use rtspclientsink and set the protocol to TCP. I have found GitHub - aler9/rtsp-simple-server: ready-to-use RTSP / RTMP / LL-HLS / WebRTC server and proxy that allows to read, publish and proxy video and audio streams to be easy to set up, with a lot of functionality.
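A sketch of the rtspclientsink route, assuming an rtsp-simple-server instance is already running on its default port 8554; the server address and the mount name "mystream" are placeholders.

#!/usr/bin/env python
import cv2

# Sketch only: publish the encoded stream to an external RTSP server
# (e.g. rtsp-simple-server) with RTP forced over TCP; the location is a placeholder.
out_send = cv2.VideoWriter('appsrc is-live=true ! videoconvert ! \
                            omxh264enc bitrate=12000000 ! h264parse ! \
                            rtspclientsink location=rtsp://127.0.0.1:8554/mystream protocols=tcp',
                            cv2.CAP_GSTREAMER, 0, 30, (1920,1080), True)

if not out_send.isOpened():
    print('VideoWriter not opened')

# Clients would then pull the stream over TCP, for example:
#   gst-launch-1.0 rtspsrc location=rtsp://<server-ip>:8554/mystream protocols=tcp ! \
#       rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink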
