Yes, it works.
I want to do same thing on jetson. But the above command does not work on jetson.
I think “rtspsrc” is not available for jetson. I tried many things but it didn’t work.
So I researched alternatives. This is it.
You may add the verbose flag (-v) to gst-launch in the working playbin case; it will tell you which plugins and caps are used between elements in the working pipeline.
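For instance, something like this (a sketch; the RTSP URI is taken from the commands later in this thread and depends on your camera):

```shell
# Run the known-working playbin pipeline verbosely; the -v output lists
# the plugins instantiated and the caps negotiated between them.
gst-launch-1.0 -v playbin uri=rtsp://192.168.53.1/live
```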
Post these here if they are not clear to you, along with some details about your stream from:
This is the result of gst-discoverer-1.0 rtsp://192.168.53.1/live.
Opening in BLOCKING MODE
Done discovering rtsp://192.168.53.1/live
Analyzing URI timed out
video: H.264 (Main Profile)
This result has too much output and I don’t know what to look at.
And I realized one thing.
The rtsp stream I want to see is actually the drone’s camera.
I used vlc from another computer to rtsp stream an mp4 video and your code (uridecodebin) worked fine.
Why can’t I see the drone’s camera…
I’m concerned about this: ** CRITICAL **: gst_buffer_add_video_time_code_meta_full: assertion 'GST_IS_BUFFER (buffer)' failed
This message does not appear when using vlc.
Please post that information so I can try to provide a working pipeline. You can stop after it starts displaying; mostly the pipeline setup is relevant here. I cannot help further without it.
OK, command: gst-launch-1.0 -v uridecodebin uri=rtsp://192.168.53.1/live source::latency=0 ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1 > gst-v.txt
and output: gst-v.txt (50.1 KB)
This command got stuck, so I hit CTRL-C after a while.
(I tried “appsink”. Is that ok?)
From the attached log, it seems to be running fine from gst-launch.
Using appsink with gst-launch is not a good idea, because no application is there to consume and free the buffers at the sink, so memory usage will grow continuously until failure unless you interrupt it first. Better to use fakesink instead for testing the pipeline.
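For example, reusing your command above with the sink swapped (a sketch; fakesink discards buffers immediately, so memory stays flat while you watch the -v output):

```shell
# Same decode chain as before, but terminated with fakesink for testing;
# buffers are dropped at the sink instead of piling up as with appsink.
gst-launch-1.0 -v uridecodebin uri=rtsp://192.168.53.1/live source::latency=0 ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! fakesink
```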
You may confirm that it receives/decodes fine with (assuming you have an X display):
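Something along these lines (a sketch, reusing the same decode chain as the commands above; xvimagesink needs a running X server):

```shell
# Decode the RTSP stream and render it on the X display with xvimagesink.
gst-launch-1.0 uridecodebin uri=rtsp://192.168.53.1/live source::latency=0 ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! xvimagesink
```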
import sys
import cv2

# RTSP capture pipeline
gst_cap = ("uridecodebin uri=rtsp://192.168.53.1/live source::latency=0 ! "
           "nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! "
           "video/x-raw,format=BGR ! appsink drop=1")
cap = cv2.VideoCapture(gst_cap, cv2.CAP_GSTREAMER)
if not cap.isOpened():
    print("capture open failed")
    sys.exit(1)
w = cap.get(cv2.CAP_PROP_FRAME_WIDTH)
h = cap.get(cv2.CAP_PROP_FRAME_HEIGHT)
fps = cap.get(cv2.CAP_PROP_FPS)
print('Src opened, %dx%d @ %d fps' % (w, h, fps))

# Video writer to fpsdisplaysink using xvimagesink
# (imshow may not be that efficient on some Jetsons such as Nano,
#  so better display through a videowriter)
gst_out = ("appsrc ! video/x-raw,format=BGR ! queue ! videoconvert ! "
           "fpsdisplaysink video-sink=xvimagesink")
out = cv2.VideoWriter(gst_out, cv2.CAP_GSTREAMER, 0, float(fps), (int(w), int(h)))
if not out.isOpened():
    print("Failed to open writer")
    sys.exit(2)

# Main loop
while True:
    ret, frame = cap.read()
    if not ret:
        break
    # Some process here on frame...
    out.write(frame)

cap.release()
out.release()
If it does not work, please post the generated output.
I tried latency=500, 1000, and 2000, and also tried removing it.
The result did not change…
I must convey both my apologies and my gratitude to you.
I’m not a native English speaker, so I apologize if I have been rude.
(Most of my messages are based on Google Translate.)
Thank you very much for taking the time to help me.