GStreamer pipeline for streaming and capturing H.264

Hi,
I’m using this pipeline for streaming processed frames:

pipeline = Gst.parse_launch('appsrc name=m_appsrc ! capsfilter name=m_capsfilter ! videoconvert ! x264enc ! rtph264pay ! udpsink name=m_udpsink')
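With appsrc, the capsfilter has to carry raw-video caps that match the frames you push, otherwise downstream negotiation fails. A minimal sketch of building such a launch string; the host, port, resolution, and extra encoder options here are assumptions, not from the original post:

```python
def build_sender_pipeline(host="127.0.0.1", port=5004,
                          width=1280, height=720, fps=30):
    """Build a launch string for streaming BGR frames pushed into appsrc."""
    # Caps describing the raw frames (OpenCV images are BGR by default).
    caps = (f"video/x-raw,format=BGR,width={width},height={height},"
            f"framerate={fps}/1")
    return (f"appsrc name=m_appsrc is-live=true format=time caps={caps} "
            "! videoconvert "
            "! x264enc tune=zerolatency "
            "! rtph264pay config-interval=1 "
            f"! udpsink name=m_udpsink host={host} port={port}")

# The resulting string would be handed to Gst.parse_launch(...) as above.
```

`is-live=true` and `format=time` on appsrc are common for live buffer sources; whether you need them depends on how you timestamp the pushed buffers.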

I can capture frames with appsink:

cap = cv2.VideoCapture(
    'udpsrc port=5004 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264"'
    ' ! rtph264depay'
    ' ! avdec_h264'
    ' ! videoconvert'
    ' ! appsink', cv2.CAP_GSTREAMER)

But I want to receive the frames on an NVR, and I need to know the URL for the connection.
When I try to connect via the URL rtsp://127.0.0.1:5004 with OpenCV:

cap = cv2.VideoCapture('rtsp://127.0.0.1:5004')

I get this error:

[tcp @ 0x2f0cf80] Connection to tcp://127.0.0.1:5004?timeout=0 failed: Connection refused

How can I find the URL to connect to the stream?

Thank you in advance!

UPD: I'm trying to send and receive frames on the same Jetson Nano, but in different Docker containers (run with the flag --net=host).
I found an example for RTSP streaming, added lines 276-283 to my code, and ran the pipeline without errors. In the second container I run this script:

cap = cv2.VideoCapture('rtsp://localhost:8554/ds-test', cv2.CAP_FFMPEG)
if cap.isOpened():
    print('opened')

But the video does not open.

Why do you think you can send a UDP stream but receive it with RTSP in OpenCV? Do you know the difference between the two protocols? Can you google the problem yourself?

It has nothing to do with deepstream.

Can the NVIDIA sample code run on your platform? Please debug your code by yourself.

It's my mistake. I understand now that I need to create an RTSP server for streaming, not just use udpsink.

I ran deepstream-test1-rtsp-out in Docker, and in the second container the video opened using VideoCapture. All the udpsink and RTSPServer properties are the same as in deepstream-test1-rtsp-out.
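For reference, that sample serves a local UDP stream through an RTSP mount roughly like this. The port, mount point, and buffer size below follow the deepstream-test1-rtsp-out pattern and may differ in your setup:

```python
def make_factory_launch(udp_port=5400, codec="H264"):
    """Launch string for an RTSP media factory that re-serves a local UDP stream."""
    return ("( udpsrc name=pay0 port=%d buffer-size=524288 "
            'caps="application/x-rtp, media=video, clock-rate=90000, '
            'encoding-name=(string)%s, payload=96" )' % (udp_port, codec))

# Usage with GstRtspServer (needs gi.require_version('GstRtspServer', '1.0')):
#   server = GstRtspServer.RTSPServer.new()
#   server.props.service = "8554"                 # -> rtsp://<host>:8554/ds-test
#   factory = GstRtspServer.RTSPMediaFactory.new()
#   factory.set_launch(make_factory_launch())
#   factory.set_shared(True)
#   server.get_mount_points().add_factory("/ds-test", factory)
#   server.attach(None)                           # serves on the GLib main loop
```

The sending pipeline's udpsink then has to point at the same local port (5400 here) that the factory's udpsrc listens on.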

Is there any other issue? If not, we will close this topic.

deepstream-test1-rtsp-out works properly, but my script does not.
When my script is running, cv2.VideoCapture('rtsp://localhost:8554/ds-test', cv2.CAP_FFMPEG) in the other container waits about 40 seconds and does not open.
When my script is not running, VideoCapture exits immediately.

deepstream-test1-rtsp-out uses an .h264 file as its source, while my script pushes frames from a buffer. I think that is the main difference, and that it requires additional capsfilter properties,
but I cannot find structured information about the correct capsfilter settings.
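When the source is a buffer rather than a file, the caps on appsrc and the capsfilter have to spell out the raw format explicitly, since nothing upstream can negotiate it. A hedged sketch; the format, resolution, and frame rate are placeholders, not values from this thread:

```python
def raw_caps_string(width, height, fps, fmt="BGR"):
    """Caps string describing raw frames pushed into appsrc."""
    return (f"video/x-raw,format={fmt},width=(int){width},"
            f"height=(int){height},framerate=(fraction){fps}/1")

# Applied to the named elements of the pipeline from the first post:
#   caps = Gst.Caps.from_string(raw_caps_string(1280, 720, 30))
#   pipeline.get_by_name("m_appsrc").set_property("caps", caps)
#   pipeline.get_by_name("m_capsfilter").set_property("caps", caps)
```

The width, height, and framerate must match the frames actually pushed, or the encoder will reject or mis-render the buffers.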

When you use deepstream-test1-rtsp-out, you can capture the output with cv2.VideoCapture('rtsp://localhost:8554/ds-test', cv2.CAP_FFMPEG), right?

Yes!

So please refer to the sample.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.