I have been trying to stream processed video over the network. I cannot stream anything using GStreamer, not only the processed frames.
1- Capture input stream (done)
2- Process video (done)
3- Display the processed image (done)
4- Stream processed image (NOT DONE)
I need help with the 4th step. I tried to do it with the OpenCV VideoWriter:
import cv2

cap = cv2.VideoCapture("/home/nvidia/Downloads/s1.mp4")
out = cv2.VideoWriter("appsrc ! video/x-raw, format=BGR ! queue ! videoconvert ! video/x-raw, format=BGRx ! nvvidconv ! omxh264enc ! video/x-h264, stream-format=byte-stream ! h264parse ! rtph264pay pt=96 config-interval=1 ! udpsink host=127.0.0.1 port=5001", cv2.CAP_GSTREAMER, 0, 25.0, (1920, 1080))
while cap.isOpened():
    ret, frame = cap.read()
    if not ret or (cv2.waitKey(1) & 0xFF == ord('q')):
        break
    out.write(frame)
# Release everything if job is finished
cap.release()
out.release()
Command for displaying the stream:
gst-launch-1.0 -e udpsrc port=50001 ! application/x-rtp, encoding-name=H264, payload=96 ! queue ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw, format=BGRx ! videoconvert ! xvimagesink
I cannot view the stream.
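For reference, the receive side can also be run from Python instead of gst-launch. This is a minimal sketch, assuming an OpenCV build with GStreamer support; avdec_h264 is used here as a software-decoder stand-in for nvv4l2decoder so the string is not Jetson-specific.

```python
# Receive-side pipeline as a Python string; the udpsrc port must match
# the sender's udpsink port (5001 here, not 50001).
# avdec_h264 is a software-decoder stand-in for nvv4l2decoder.
RX_PIPELINE = (
    "udpsrc port=5001 ! "
    "application/x-rtp, encoding-name=H264, payload=96 ! "
    "queue ! rtph264depay ! h264parse ! avdec_h264 ! "
    "videoconvert ! video/x-raw, format=BGR ! appsink"
)

# Usage (assumes cv2 was built with GStreamer):
#   cap = cv2.VideoCapture(RX_PIPELINE, cv2.CAP_GSTREAMER)
print(RX_PIPELINE.split(" ! ")[0])  # -> udpsrc port=5001
```

If frames still do not arrive, checking this pipeline with gst-launch-1.0 first isolates whether the problem is in OpenCV or in the network path.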
Probably the port is not set correctly: it should be 5001 instead of 50001. It might fail due to the typo.
I tried that, but it didn't work either.
We tried running the sample on JP4.4.1 (r32.4.4) and it works well:
import cv2

cap = cv2.VideoCapture("filesrc location=/home/nvidia/sample_1080p_h264.mp4 ! qtdemux ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw, format=BGRx ! videoconvert ! video/x-raw, format=BGR ! appsink", cv2.CAP_GSTREAMER)
w = cap.get(cv2.CAP_PROP_FRAME_WIDTH)
h = cap.get(cv2.CAP_PROP_FRAME_HEIGHT)
fps = cap.get(cv2.CAP_PROP_FPS)
print('Src opened, %dx%d @ %d fps' % (w, h, fps))
gst_out = "appsrc ! video/x-raw, format=BGR ! queue ! videoconvert ! video/x-raw, format=BGRx ! nvvidconv ! nvv4l2h264enc ! h264parse ! rtph264pay pt=96 config-interval=1 ! udpsink host=127.0.0.1 port=5001"
out = cv2.VideoWriter(gst_out, cv2.CAP_GSTREAMER, 0, float(fps), (int(w), int(h)))
if not out.isOpened():
    print("Failed to open output")
while True:
    ret_val, img = cap.read()
    if not ret_val:
        print("pipeline read failed")
        break
    out.write(img)
cap.release()
out.release()
You may refer to it and give it a try.
If you use JP4.4 (r32.4.3), please remove h264parse and try:
gst-launch-1.0 -e udpsrc port=5001 ! application/x-rtp, encoding-name=H264, payload=96 ! queue ! rtph264depay ! nvv4l2decoder ! nvvidconv ! video/x-raw, format=BGRx ! videoconvert ! xvimagesink
When I try to run the above example that you shared with me, all I see is a blank screen. It does not throw any error. Since I use JP4.4, I removed h264parse from the pipeline. Now I get a "failed to open output" error.
Please check if you can run with
Looks like the error is reported from qtdemux. Probably it is not an h264 stream in s1.mp4. We tried with sample_1080p_h264.mp4 and it runs fine.
When I check the codec, I see h264 for the video that I want to stream.
We would suggest clarifying whether the failure is specific to decoding s1.mp4, or whether decoding sample_1080p_h264.mp4 also fails.
Since I did not flash DeepStream during the JetPack 4.4 installation, I don't have that video. But I will try it.
You can try sample files in
Since they are mkv files, you would need to replace qtdemux with matroskademux.
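To avoid editing the pipeline string by hand for each container, the demuxer can be chosen from the file extension. decode_pipeline below is a hypothetical helper, not part of the sample code:

```python
# Hypothetical helper: pick the demuxer from the container extension,
# since qtdemux handles MP4/MOV while .mkv needs matroskademux.
def decode_pipeline(path):
    demux = "matroskademux" if path.endswith(".mkv") else "qtdemux"
    return (
        f"filesrc location={path} ! {demux} ! h264parse ! "
        "nvv4l2decoder ! nvvidconv ! video/x-raw, format=BGRx ! "
        "videoconvert ! video/x-raw, format=BGR ! appsink"
    )

print(decode_pipeline("sample.mkv").split(" ! ")[1])  # -> matroskademux
```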
@DaneLLL, when I run exactly the same code and pipeline, except for the input video (which I am sure is compressed with h264), I can see that the stream starts on the port I specified. However, I cannot display the streamed frames using the pipeline you mentioned.
Stream processed video with OpenCV on Jetson TX2 - #5 by DaneLLL code snippet
I use JetPack 4.4, by the way.
For confirmation, so it works with
but fails with your own video file. Is that correct?
It does not work with either video.