Streaming OpenCV frames using GStreamer and reading them on Node

Hello,
I am trying to stream a VideoCapture over the network. I used FastAPI and Uvicorn for this and it worked well, but now I am moving to a wireless network and the network can't handle the stream: I'm getting 2-3 fps with a 5 second lag. I read that GStreamer is the best way to stream the frames, although I will need a decoder on the receiving end of the stream.

These are my sending pipelines:

camset = 'v4l2src device=/dev/video0 ! video/x-raw,width=640,height=360 ! nvvidconv flip-method=0 \
        ! video/x-raw(memory:NVMM), format=I420, width=640, height=360 ! nvvidconv ! video/x-raw, format=BGRx ! videoconvert \
        ! video/x-raw, format=BGR ! appsink'

gst_out = "appsrc ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=superfast !\
     rtph264pay ! udpsink host=127.0.0.1 port=8000"
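Wiring the two pipelines above together in one script would look roughly like the sketch below. The names (`CAP_PIPELINE`, `OUT_PIPELINE`, `relay`) and the frame size/fps values are my own assumptions, not from the thread; the pipeline strings mirror the ones posted above.

```python
# Hypothetical sketch: read camera frames through the capture pipeline and
# push them into the RTP sender via cv2.VideoWriter. All names are assumed.

CAP_PIPELINE = (
    "v4l2src device=/dev/video0 ! video/x-raw,width=640,height=360 "
    "! nvvidconv ! video/x-raw(memory:NVMM),format=I420 "
    "! nvvidconv ! video/x-raw,format=BGRx ! videoconvert "
    "! video/x-raw,format=BGR ! appsink"
)

OUT_PIPELINE = (
    "appsrc ! videoconvert ! x264enc tune=zerolatency bitrate=500 "
    "speed-preset=superfast ! rtph264pay "
    "! udpsink host=127.0.0.1 port=8000"
)

def relay(width=640, height=360, fps=30):
    """Copy frames from the capture pipeline into the sender pipeline."""
    import cv2  # imported lazily so the strings above can be inspected without OpenCV

    cap = cv2.VideoCapture(CAP_PIPELINE, cv2.CAP_GSTREAMER)
    # fourcc=0 lets the GStreamer backend pick the format from the pipeline caps
    out = cv2.VideoWriter(OUT_PIPELINE, cv2.CAP_GSTREAMER, 0, fps, (width, height))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        out.write(frame)
    cap.release()
    out.release()
```

This requires an OpenCV build with GStreamer support; otherwise both `VideoCapture` and `VideoWriter` fail to open.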

Right now this is my receiving pipeline, but it doesn't work:

camSet='udpsrc host=127.0.0.1 port=8000 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, \
        encoding-name=(string)H264,\
        payload=(int)26" ! rtph264depay ! omxh264dec ! videoconvert ! appsink'

I am receiving in Python to see if my stream even works, and it doesn't seem like it does; or perhaps my receiving pipeline doesn't work.

In the end I'll need a Node server that will receive the stream, decode it, and show it with HTML.
I would really appreciate any help, as I am new to GStreamer and I can't seem to find anything about receiving and decoding the stream.

Hi,
Please try the hardware encoder nvv4l2h264enc. Some samples for reference:
Stream processed video with OpenCV on Jetson TX2 - #5 by DaneLLL
OpenvCV, Gstreamer, Python camera capture/access and streaming to RTP

You can also try setting insert-sps-pps=1 idrinterval=15, as in:
Gstreamer TCPserversink 2-3 seconds latency - #5 by DaneLLL
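Combining the two suggestions (hardware encoder plus SPS/PPS insertion and a short IDR interval) into one sender string might look like this sketch; the host/port are placeholders, and nvvidconv/nvv4l2h264enc are Jetson-only plugins:

```python
# Hypothetical sender string combining the suggested options; host and port
# are placeholders, not values from the thread.
SENDER = (
    "appsrc ! video/x-raw,format=BGR ! videoconvert "
    "! video/x-raw,format=BGRx ! nvvidconv "
    "! video/x-raw(memory:NVMM),format=NV12 "
    "! nvv4l2h264enc insert-sps-pps=1 idrinterval=15 "
    "! h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=5004"
)
```

insert-sps-pps=1 repeats the stream headers with every IDR frame, so a receiver that joins mid-stream can start decoding at the next IDR instead of waiting indefinitely.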

After a little bit of research I've changed my sender and receiver pipes to these:
Sender:

gst_str_rtp = "appsrc ! video/x-raw,format=BGR ! queue ! videoconvert ! video/x-raw,format=BGRx ! nvvidconv !\
     video/x-raw(memory:NVMM),format=NV12,width=640,height=360,framerate=52/1 ! nvv4l2h264enc insert-sps-pps=1 \
        insert-vui=1 idrinterval=30 ! h264parse ! rtph264pay ! udpsink host=169.254.84.12 port=5004 auto-multicast=0"

Receiver:

camSet='udpsrc port=5004 auto-multicast=0 ! application/x-rtp,media=video,encoding-name=H264 ! \
        rtpjitterbuffer latency=0 \
    ! rtph264depay ! decodebin ! nvvidconv ! video/x-raw,format=BGRx ! \
        videoconvert ! video/x-raw,format=BGR ! appsink drop=1'

These pipes work for a stream on the same computer, but when trying to stream over the network to a different computer, the receiver on the second computer isn't working: VideoCapture keeps returning None.

Hi,
Please check if this is the IP address of receiver:

This is the IP of the receiver; that is why on the receiver side I did not add a host or address.

Hi,
Is the different computer also a Jetson platform? Some plugins are only available on Jetson platforms; on other platforms you would need to use other plugins.

The receiving computer is a Windows computer running Windows 10.
I used gst-launch and got a good frame like this:

gst-launch-1.0 udpsrc port=5004 ! application/x-rtp,media=video,encoding-name=H264 ! queue ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink

But when using the pipeline in OpenCV, the program is stuck on VideoCapture.
Maybe the pipeline isn't accurate, but I tried a few pipelines in OpenCV and none worked.

In the end I am trying to restream the received frames to an HTML server. I want to use OpenCV to do so, but I can also stream directly from gst-launch, I'm just not sure how. The HTML takes the frames from the web.
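One common way to get OpenCV frames into a browser page without Node is to serve them as an MJPEG (multipart/x-mixed-replace) stream straight from Python; the page then just needs an `<img>` tag pointing at the server. This is a sketch under my own assumptions (port, boundary name, and pipeline are placeholders), not something from the thread:

```python
# Hypothetical MJPEG restreamer: decode the RTP stream with OpenCV and serve
# each frame as a JPEG part over HTTP. Port, boundary, pipeline are assumed.
from http.server import BaseHTTPRequestHandler, HTTPServer

BOUNDARY = b"frameboundary"

GST_PIPELINE = (
    "udpsrc port=5004 ! application/x-rtp,media=video,encoding-name=H264 "
    "! queue ! rtph264depay ! h264parse ! avdec_h264 "
    "! videoconvert ! video/x-raw,format=BGR ! appsink"
)

def mjpeg_part(jpeg_bytes: bytes) -> bytes:
    """Wrap one JPEG-encoded frame as a multipart/x-mixed-replace chunk."""
    return (
        b"--" + BOUNDARY + b"\r\n"
        b"Content-Type: image/jpeg\r\n"
        b"Content-Length: " + str(len(jpeg_bytes)).encode() + b"\r\n\r\n"
        + jpeg_bytes + b"\r\n"
    )

class StreamHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        import cv2  # lazy import so mjpeg_part() is usable without OpenCV
        self.send_response(200)
        self.send_header("Content-Type",
                         "multipart/x-mixed-replace; boundary=" + BOUNDARY.decode())
        self.end_headers()
        cap = cv2.VideoCapture(GST_PIPELINE, cv2.CAP_GSTREAMER)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            ok, jpg = cv2.imencode(".jpg", frame)
            if ok:
                self.wfile.write(mjpeg_part(jpg.tobytes()))
        cap.release()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), StreamHandler).serve_forever()
```

The HTML side would then be just `<img src="http://<server-ip>:8080/">`.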

Hi,
On a Linux system, to put a GStreamer pipeline in cv2.VideoCapture(), you would need to build OpenCV with -D WITH_GSTREAMER=ON. Not sure, but it may be the same for a Windows system. This would need other users to share their experience.
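A quick way to confirm the build flag took effect is to look for the "GStreamer" line in cv2.getBuildInformation(). The parsing helper below is my own, not an OpenCV API:

```python
# Check whether an OpenCV build reports GStreamer support by parsing the
# text returned by cv2.getBuildInformation(). The helper is an assumption.
import re

def has_gstreamer(build_info: str) -> bool:
    """True if the build information contains a 'GStreamer: YES' line."""
    m = re.search(r"GStreamer\s*:\s*(\S+)", build_info)
    return bool(m) and m.group(1).upper().startswith("YES")

# Usage (requires OpenCV installed):
# import cv2
# print(has_gstreamer(cv2.getBuildInformation()))
```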

This is my cv2.getBuildInformation():
OpenCV has been built with GStreamer and it runs well.

My code:

import os
import cv2

video_frame = None

def main():
    global video_frame

    camSet = ('udpsrc port=5004 auto-multicast=0 ! application/x-rtp,media=video,encoding-name=H264 ! rtpjitterbuffer latency=0 '
              '! rtph264depay ! avdec_h264 ! nvvidconv ! video/x-raw,format=BGRx ! '
              'videoconvert ! video/x-raw,format=BGR ! appsink drop=1')
    print("after camset")
    cap = cv2.VideoCapture(camSet, cv2.CAP_GSTREAMER)
    while True:
        print("in loop")
        ret, frame = cap.read()
        if ret:
            cv2.imshow('stream', frame)
        else:
            print("Bad frame received")
        #outvid.write(frame)
        if cv2.waitKey(1) == ord('q'):
            break
    cap.release()
    cv2.destroyAllWindows()
    os._exit(0)

Not sure why, but it gets stuck on VideoCapture with no logs or errors.

Hi,
Does it work if you remove nvvidconv plugin? It is only available on Jetson platforms.

I thought nvvidconv was for any NVIDIA HW.
Regardless, I removed nvvidconv and it still got stuck on VideoCapture.
I removed videoconvert as well, and it didn't get stuck on VideoCapture, but it didn't manage to capture anything. Need to mess more with the pipes, I guess.

Hi,
On Linux, if this command works:

gst-launch-1.0 udpsrc port=5004 ! application/x-rtp,media=video,encoding-name=H264 ! queue ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! video/x-raw, format=BGR ! fakesink

This string should work in cv2.VideoCapture():

udpsrc port=5004 ! application/x-rtp,media=video,encoding-name=H264 ! queue ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! video/x-raw, format=BGR ! appsink

Not sure, but it should be the same on Windows. Please give it a try.

Not sure why, but it just gets stuck like this. If I do cv2.namedWindow('stream') at the start of the code, the window is not responding.
Is there any log I can open to see what is wrong?
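For logs: GStreamer's own debug output can be turned on with the GST_DEBUG environment variable, which the OpenCV backend inherits. It has to be set before the pipeline is created, so safest is before importing cv2; the level and file name below are just example values:

```python
# Enable GStreamer debug logging for a cv2.VideoCapture pipeline.
# Set the variables BEFORE importing cv2 so the pipeline picks them up.
import os

os.environ["GST_DEBUG"] = "3"             # 3 = fixmes/warnings; "udpsrc:5" traces one element
os.environ["GST_DEBUG_FILE"] = "gst.log"  # write the log to a file instead of stderr

# import cv2  # import only after the variables are set
# cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
```

With this enabled, a pipeline that hangs in VideoCapture usually prints which element failed to negotiate caps or never received data.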

There is no update from you for a period, assuming this is not an issue any more.
Hence we are closing this topic. If need further support, please open a new one.
Thanks

Sorry for the late response, is this still an issue to support? Thanks