Stream processed video with OpenCV on Jetson TX2

Hello everyone,

I have been searching for a way to stream processed video over the network. I cannot stream anything using GStreamer, not just processed frames.

Steps:

1- Capture input stream (done)
2- Process video (done)
3- Display the processed image (done)
4- Stream processed image (NOT DONE)

I need help with the 4th step. I tried to do it with OpenCV's VideoWriter.

stream.py

import cv2

cap = cv2.VideoCapture("/home/nvidia/Downloads/s1.mp4")

out = cv2.VideoWriter("appsrc ! video/x-raw, format=BGR ! queue ! videoconvert ! video/x-raw, format=BGRx ! nvvidconv ! omxh264enc ! video/x-h264, stream-format=byte-stream ! h264parse ! rtph264pay pt=96 config-interval=1 ! udpsink host=127.0.0.1 port=5001", cv2.CAP_GSTREAMER, 0, 25.0, (1920,1080))

while cap.isOpened():
    ret, frame = cap.read()
    if ret:
        if out.isOpened():
            out.write(frame)
            print('writing frame')
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
    else:
        break

# Release everything if job is finished
cap.release()
out.release()

Command for displaying the stream:

gst-launch-1.0 -e udpsrc port=50001 ! application/x-rtp, encoding-name=H264, payload=96 ! queue ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw, format=BGRx ! videoconvert ! xvimagesink

I cannot view the stream.

Hi,
Probably the port is not set correctly; it should be 5001 instead of 50001. The receiver might fail due to that typo.
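
For reference, this is your display command with only the port changed (everything else kept as in your original command):

gst-launch-1.0 -e udpsrc port=5001 ! application/x-rtp, encoding-name=H264, payload=96 ! queue ! rtph264depay ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw, format=BGRx ! videoconvert ! xvimagesink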

Tried that but didn’t work either.

Hi,
We tried running the sample below on JP4.4.1 (r32.4.4) and it works well:

import sys
import cv2

def read_cam():
    cap = cv2.VideoCapture("filesrc location=/home/nvidia/sample_1080p_h264.mp4 ! qtdemux ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw, format=BGRx ! videoconvert ! video/x-raw,format=BGR ! appsink  ")

    w = cap.get(cv2.CAP_PROP_FRAME_WIDTH)
    h = cap.get(cv2.CAP_PROP_FRAME_HEIGHT)
    fps = cap.get(cv2.CAP_PROP_FPS)
    print('Src opened, %dx%d @ %d fps' % (w, h, fps))

    gst_out = "appsrc ! video/x-raw, format=BGR ! queue ! videoconvert ! video/x-raw,format=BGRx ! nvvidconv ! nvv4l2h264enc ! h264parse ! rtph264pay pt=96 config-interval=1 ! udpsink host=127.0.0.1 port=5001 "
    out = cv2.VideoWriter(gst_out, cv2.CAP_GSTREAMER, 0, float(fps), (int(w), int(h)))
    if not out.isOpened():
        print("Failed to open output")
        exit()

    if cap.isOpened():
        while True:
            ret_val, img = cap.read()
            if not ret_val:
                # End of stream or read error
                break
            out.write(img)
            cv2.waitKey(1)
    else:
        print("pipeline open failed")

    print("successfully exit")
    cap.release()
    out.release()


if __name__ == '__main__':
    read_cam()

You may refer to it and give it a try.

If you use JP4.4 (r32.4.3), please remove h264parse and give this a try:

gst-launch-1.0 -e udpsrc port=5001 ! application/x-rtp, encoding-name=H264, payload=96 ! queue ! rtph264depay ! nvv4l2decoder ! nvvidconv ! video/x-raw, format=BGRx ! videoconvert ! xvimagesink

Hello @DaneLLL,

When I try to run the example you shared, all I see is a blank screen; it does not throw any error. Since I use JP4.4, I tried removing h264parse from the pipeline. Now I get a "Failed to open output" error.

Hi,
Please check if you can run with

/opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_1080p_h264.mp4

Looks like the error is reported by qtdemux. Probably s1.mp4 does not contain an h264 stream. We tried with sample_1080p_h264.mp4 and it runs fine.
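
One way to double-check is with gst-discoverer-1.0 (part of the GStreamer base plugin tools), if it is installed on your board, for example:

gst-discoverer-1.0 /home/nvidia/Downloads/s1.mp4

It prints the container format and the video/audio codecs of the file.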

When I check the codec of the video I want to stream, I see h264.

Hi,
We would suggest clarifying whether the failure is specific to decoding s1.mp4, or whether you also fail to decode sample_1080p_h264.mp4.

Since I did not install DeepStream when flashing JetPack 4.4, I don't have that video. But I will try it.

Hi,
You can try the sample files at
https://jell.yfish.us/

Since they are mkv files, you would need to replace qtdemux with matroskademux.
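
For example (a sketch only; the file name /home/nvidia/jellyfish.mkv is just a placeholder for whichever clip you download), the capture pipeline in the script above would become:

cap = cv2.VideoCapture("filesrc location=/home/nvidia/jellyfish.mkv ! matroskademux ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw, format=BGRx ! videoconvert ! video/x-raw, format=BGR ! appsink")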

Hello again @DaneLLL, when I run exactly the same code and pipeline, except for the input video (which I am sure is compressed with h264), I can see that the stream starts on the port I specified. However, I cannot display the streamed frames using the pipeline you mentioned.

I am using the code snippet from Stream processed video with OpenCV on Jetson TX2 - #5 by DaneLLL.
I use JetPack 4.4, btw.

Hi,
For confirmation: it works with

/opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_1080p_h264.mp4

But it fails with your own video file. Is that correct?

It does not work with either video.

The issue was the pipeline itself. Solved using the pipeline from How to stream video to network from python? - #2 by Honey_Patouceul.
