Capturing a stream from raspberry via Gstreamer

I am trying to capture a vide stream coming from a rapberry zero. I can stream the video via gstreamer using below command on terminal.
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=-1 ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host= port=5200
And I can capture the stream from nano using the terminal command.
gst-launch-1.0 -v udpsrc port=5200 ! application/x-rtp, media=video, clock-rate=90000, payload=96 ! rtpjpegdepay ! jpegdec ! videoconvert ! autovideosink

How can I integrate those commands into a python script so that I can process the stream with opencv. I want to use a python script both on the sender raspberry and on the receiving jetson.

Everything works fine on linux terminal but could not succeed to make it work on python. Any assitance?

below is the python script and it can not capture the frame.

import numpy as np
import cv2

cimport numpy as np
import cv2

cap =cv2.VideoCapture(‘udpsrc port=5200 caps = “application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96” ! rtph264depay ! decodebin ! videoconvert ! appsink’, cv2.CAP_GSTREAMER)
while True:

ret,frame =

if not ret:
    print('empty frame')

cv2.imshow('receive', frame)
if cv2.waitKey(1)&0xFF == ord('q'):
import time
import Jetson.GPIO as GPIO
import jetson.inference
import jetson.utils
import cv2

LED_PIN = 11

net = jetson.inference.detectNet(model='models/fruit/ssd-mobilenet.onnx',

# Open the CSI camera using OpenCV
camera = cv2.VideoCapture(gstreamer_pipeline(flip_method=0), cv2.CAP_GSTREAMER)

# Set the desired width and height of the camera
width, height = 1280, 720
camera.set(cv2.CAP_PROP_FRAME_WIDTH, width)
camera.set(cv2.CAP_PROP_FRAME_HEIGHT, height)

# Create a window to display the camera feed
window_title = "CSI Camera"
cv2.namedWindow(window_title, cv2.WINDOW_AUTOSIZE)

def main():
    while True:
            ret, frame =
            if not ret:

            # Convert BGR image to RGBA for jetson_inference
            img_rgba = cv2.cvtColor(frame, cv2.COLOR_BGR2RGBA)

            # Get the width and height of the image
            width, height = frame.shape[1], frame.shape[0]

            detections = net.Detect(img_rgba, width, height)

            # Display the image with detected objects
            cv2.imshow(window_title, frame)

            GPIO.output(LED_PIN, GPIO.LOW)
            Apple_detected = False
            for detection in detections:
                if detection.ClassID == 1:
                    GPIO.output(LED_PIN, GPIO.HIGH)
                    print("Apple detected")
                    Apple_detected = True

            if not Apple_detected:
                GPIO.output(LED_PIN, GPIO.LOW)

            if cv2.waitKey(10) & 0xFF == 27:

        except Exception as e:
            print("Error capturing image from camera:", str(e))

    # Release the camera and close the window when done

if __name__ == "__main__":

I’ve tried using the provided code, but I’m encountering an issue. The camera starts up as expected, but it doesn’t seem to detect any fruits, and I don’t see any bounding boxes being drawn around them. Could you please assist me in resolving this problem? I’m not sure why the fruit detection and bounding box drawing are not functioning as intended.

Please refer to the python samples:
Doesn't work nvv4l2decoder for decoding RTSP in gstreamer + opencv - #3 by DaneLLL
Displaying to the screen with OpenCV and GStreamer - #9 by DaneLLL
Stream processed video with OpenCV on Jetson TX2 - #5 by DaneLLL
OpenvCV, Gstreamer, Python camera capture/access and streaming to RTP

These are similar to this use-case. Please refer to the samples for developing your use-case.

I went through those samples but the use-cases are different. I can get a stream from http or rtsp but it has a bigger lag comparing to gstreamer.

I am stuck with this problem, did anyone succeed to capture gstreamer video streamed over wifi with opencv

Do you run this command on Raspberry PI?

gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=-1 ! video/x-raw, width=640, height=480, framerate=30/1 ! videoconvert ! jpegenc ! rtpjpegpay ! udpsink host= port=5200