Display ZED Camera Stream to GStreamer

I’m using the ZED stereoscopic camera to grab a video stream and display it through GStreamer. I’m starting from the ZED streaming example and simply modifying it to write to a cv2.VideoWriter with a GStreamer pipeline instead of displaying with OpenCV, like so:

import sys
import pyzed.sl as sl
import cv2

def main():

    init = sl.InitParameters()
    init.camera_resolution = sl.RESOLUTION.HD720
    init.depth_mode = sl.DEPTH_MODE.NONE
    cam = sl.Camera()
    status = cam.open(init)
    if status != sl.ERROR_CODE.SUCCESS:
        print(repr(status))
        exit(1)

    runtime = sl.RuntimeParameters()
    mat = sl.Mat()

    stream = sl.StreamingParameters()
    stream.codec = sl.STREAMING_CODEC.H264
    stream.bitrate = 4000
    stream.port = 30000
    status = cam.enable_streaming(stream)
    if status != sl.ERROR_CODE.SUCCESS:
        print(repr(status))
        exit(1)

    key = 0
    print("  Quit : CTRL+C\n")
    gst = "appsrc ! queue ! videoconvert ! video/x-raw,format=RGBA ! nvvidconv ! nvegltransform ! nveglglessink "
    vw = cv2.VideoWriter(gst, cv2.CAP_GSTREAMER, 0, 60, (1080, 720))
    while key != 113:
        err = cam.grab(runtime)
        cam.retrieve_image(mat, sl.VIEW.LEFT)
        vw.write(mat.get_data())
        # if (err == sl.ERROR_CODE.SUCCESS) :
        #     cam.retrieve_image(mat, sl.VIEW.SIDE_BY_SIDE)
        #     frame = mat.get_data()
        #     cv2.imshow("ZED", frame)
        #     key = cv2.waitKey(1)
        # else :
        #     key = cv2.waitKey(1)

    cam.disable_streaming()
    cam.close()

if __name__ == "__main__":
    main()

From the above, my pipeline is:

gst = "appsrc ! queue ! videoconvert ! video/x-raw,format=RGBA ! nvvidconv ! nvegltransform ! nveglglessink "
vw = cv2.VideoWriter(gst, cv2.CAP_GSTREAMER, 0, 60, (1080, 720))

and when I run the script I get this warning, and the GStreamer window just shows my computer screen:
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (1629) writeFrame OpenCV | GStreamer warning: cvWriteFrame() needs images with depth = IPL_DEPTH_8U and nChannels = 3.

Your writer pipeline expects BGR frames, but maybe setting the H264 codec makes your app send H264-encoded video.
I’m unable to try it now, but you may try a writer such as:

gst = "appsrc ! queue ! video/x-h264 ! h264parse ! nvv4l2decoder ! nvvidconv ! nvegltransform ! nveglglessink "
vw = cv2.VideoWriter(gst, cv2.CAP_GSTREAMER, 0, 60, (1080, 720))
print(vw.isOpened())

Note that the SDK may be using resources for H264 encoding while you would also be using resources for decoding. I have no experience with the SDK way, but you can easily get the camera feed with a GStreamer capture pipeline such as (with the ZED as /dev/video2):

cap = cv2.VideoCapture("v4l2src device=/dev/video2 io-mode=2 ! video/x-raw,format=YUY2,width=2560,height=720,framerate=30/1 ! nvvidconv ! video/x-raw(memory:NVMM) ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1", cv2.CAP_GSTREAMER)

Also, the original code checked the grab return code, which may fail, and in that case did nothing. Just write to your writer where imshow was called in the original code.
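
For illustration, a minimal rework of your loop along those lines could look like this (the BGRA-to-BGR conversion is only an assumption based on the warning above, which asks for 8-bit, 3-channel frames):

    while key != 113:  # 113 == ord('q')
        err = cam.grab(runtime)
        if err == sl.ERROR_CODE.SUCCESS:
            cam.retrieve_image(mat, sl.VIEW.LEFT)
            # convert the 4-channel ZED image to 3-channel BGR for the writer
            vw.write(cv2.cvtColor(mat.get_data(), cv2.COLOR_BGRA2BGR))
        key = cv2.waitKey(1)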

I tried the gst pipeline with the H264 decoding, but I still get the same error, and vw.isOpened() prints True. I’m also not sure how I can use the VideoCapture object above to display through GStreamer instead of OpenCV.

If you just want to display and don’t need any OpenCV processing, the following pipeline should be enough (still assuming the ZED is /dev/video2 and that a GUI is running on an X server):

gst-launch-1.0 v4l2src device=/dev/video2 ! video/x-raw,format=YUY2,width=2560,height=720,framerate=30/1 ! xvimagesink

You may try other resolutions/framerates as reported by v4l2-ctl (provided by the apt package v4l-utils):

v4l2-ctl --device=/dev/video2 --list-formats-ext

If you want to process frames with OpenCV, you may open a GStreamer capture pipeline such as:

cap = cv2.VideoCapture("v4l2src device=/dev/video2 io-mode=2 ! video/x-raw,format=YUY2,width=2560,height=720,framerate=30/1 ! nvvidconv ! video/x-raw(memory:NVMM) ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1", cv2.CAP_GSTREAMER)

Check that it opened correctly, then create the video writer:

out = cv2.VideoWriter("appsrc ! video/x-raw, format=BGR ! queue ! videoconvert ! video/x-raw,format=BGRx ! nvvidconv ! nvegltransform ! nveglglessink ", cv2.CAP_GSTREAMER, 0, 60.0, (1080, 720))
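
For both objects, a minimal check could be (the error messages here are just placeholders):

if not cap.isOpened():
    raise RuntimeError("capture pipeline failed to open")
if not out.isOpened():
    raise RuntimeError("writer pipeline failed to open")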

Once both are open, run a loop such as:

while True:
    ret, frame = cap.read()
    if not ret:
        break
    out.write(frame)
    cv2.waitKey(1)

Thanks, the gst-launch command works great and gives the framerates I need. However, when I use the VideoCapture/VideoWriter the frame rate gets much slower (about one frame every 5 seconds). My code is simply:

import cv2

if __name__ == "__main__":
    cap = cv2.VideoCapture("v4l2src device=/dev/video1 io-mode=2 ! video/x-raw,format=YUY2,width=2560,height=720,framerate=30/1 ! nvvidconv ! video/x-raw(memory:NVMM) ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1", cv2.CAP_GSTREAMER)
    out = cv2.VideoWriter("appsrc ! video/x-raw, format=BGR ! queue ! videoconvert ! video/x-raw,format=BGRx ! nvvidconv ! nvegltransform ! nveglglessink ", cv2.CAP_GSTREAMER, 0, 60.0, (2560, 720))


    while True :
        ret, frame = cap.read()
        out.write(frame)
        cv2.waitKey(1)

Any recommendations on how I can make the pipeline faster so the python script is usable?

I do see the same… sorry for the weird advice; I think this was working in previous releases.
For now, it may be better not to use the VideoWriter and to display with cv2.imshow instead, which seems to work better:

    while True :
        ret, frame = cap.read()
        cv2.imshow("frame", frame)
        cv2.waitKey(1)
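
Put together with the capture string from your last snippet (still assuming the ZED is /dev/video1), a self-contained sketch might look like this; the 'q'-to-quit handling is just added for convenience:

import cv2

cap = cv2.VideoCapture("v4l2src device=/dev/video1 io-mode=2 ! video/x-raw,format=YUY2,width=2560,height=720,framerate=30/1 ! nvvidconv ! video/x-raw(memory:NVMM) ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1", cv2.CAP_GSTREAMER)
if not cap.isOpened():
    raise RuntimeError("capture pipeline failed to open")

while True:
    ret, frame = cap.read()
    if not ret:
        break
    cv2.imshow("frame", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()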

Well, it seems it fails to keep sync. Just try:

out = cv2.VideoWriter("appsrc ! video/x-raw, format=BGR ! queue ! videoconvert ! video/x-raw,format=BGRx ! nvvidconv ! video/x-raw(memory:NVMM),format=RGBA ! nvegltransform ! nveglglessink sync=0", cv2.CAP_GSTREAMER, 0, float(60), (int(2560), int(720)))

I tried on my other Jetson and the following configuration of VideoCapture/Writer worked:

    zed = cv2.VideoCapture("v4l2src device=/dev/video1 io-mode=2 ! video/x-raw,format=YUY2,width=2560,height=720,framerate=30/1 ! nvvidconv ! video/x-raw(memory:NVMM) ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1", cv2.CAP_GSTREAMER)
    gst = "appsrc ! queue ! videoconvert ! video/x-raw, format=RGBA ! nvvidconv ! \
           video/x-raw(memory:NVMM), width={}, height={} ! nvegltransform ! nveglglessink ".format(DISPLAY_WIDTH, DISPLAY_HEIGHT)
    vw = cv2.VideoWriter(gst, cv2.CAP_GSTREAMER, 0, 30, (DISPLAY_WIDTH, DISPLAY_HEIGHT))
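
For completeness, a runnable version of that configuration might look like this (DISPLAY_WIDTH/DISPLAY_HEIGHT are placeholders from the snippet above; the 1280x360 values here are just an example):

import cv2

DISPLAY_WIDTH, DISPLAY_HEIGHT = 1280, 360  # example display size, pick what fits your screen

zed = cv2.VideoCapture("v4l2src device=/dev/video1 io-mode=2 ! video/x-raw,format=YUY2,width=2560,height=720,framerate=30/1 ! nvvidconv ! video/x-raw(memory:NVMM) ! nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw,format=BGR ! appsink drop=1", cv2.CAP_GSTREAMER)
gst = "appsrc ! queue ! videoconvert ! video/x-raw, format=RGBA ! nvvidconv ! \
       video/x-raw(memory:NVMM), width={}, height={} ! nvegltransform ! nveglglessink ".format(DISPLAY_WIDTH, DISPLAY_HEIGHT)
vw = cv2.VideoWriter(gst, cv2.CAP_GSTREAMER, 0, 30, (DISPLAY_WIDTH, DISPLAY_HEIGHT))

while True:
    ret, frame = zed.read()
    if not ret:
        break
    # the writer was opened for DISPLAY_WIDTH x DISPLAY_HEIGHT frames, so resize before writing
    vw.write(cv2.resize(frame, (DISPLAY_WIDTH, DISPLAY_HEIGHT)))
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

zed.release()
vw.release()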

Are both Jetsons NX, and do they run the same L4T release?

head -n 1 /etc/nv_tegra_release

Both are NX; this one is JetPack 4.6 vs. JetPack 4.5 on the other one.

I tested on an AGX Xavier with R32.6.1, and it was unable to keep sync.
So you may add sync=0 to the nveglglessink properties of the pipeline on JetPack 4.6.

Could you elaborate on what you mean by unable to keep sync?

I think this is mainly related to the GStreamer presentation timestamps.
If the sink uses sync=true, a buffer may only be used (displayed, …) when its timestamp says it is time for it.
Using sync=false just uses each buffer as soon as it becomes available.
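
As a rough illustration (using a test source instead of the camera, and xvimagesink as in the display pipeline above): with the default sync=true the sink paces rendering to the 30 fps buffer timestamps, while with sync=false it renders each buffer as soon as it arrives.

gst-launch-1.0 videotestsrc num-buffers=300 ! video/x-raw,framerate=30/1 ! videoconvert ! xvimagesink sync=false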
Hope this helps.

Thanks, I’m not sure whether adding the sync property changes much for me, but the video streaming works well now. Thanks for your help!