Error while running GStreamer pipeline with DeepStream using Python

I'm trying to run a GStreamer pipeline using DeepStream and Python.

This is the pipeline which is running successfully in terminal:

gst-launch-1.0 rtspsrc location="rtsp://admin:123456@192.168.0.150:554/H264?ch=1&subtype=0&proto=Onvif" latency=300 ! rtph264depay ! h264parse ! nvv4l2decoder drop-frame-interval=1 ! nvvideoconvert ! video/x-raw,width=1920,height=1080,format=I420 ! queue ! nveglglessink window-x=0 window-y=0 window-width=1080 window-height=720

but when I run the same pipeline from Python code, I get this error:

Error: gst-stream-error-quark: Internal data stream error. (1): gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstRTSPSrc:video-source/GstUDPSrc:udpsrc2:
streaming stopped, reason not-linked (-1)

This is the code I'm using to run the GStreamer pipeline:

import gi
import sys

gi.require_version("Gst", "1.0")

from gi.repository import Gst, GObject

def bus_call(bus, message, loop):
    t = message.type
    if t == Gst.MessageType.EOS:
        sys.stdout.write("End-of-stream\n")
        loop.quit()
    elif t == Gst.MessageType.ERROR:
        err, debug = message.parse_error()
        sys.stderr.write("Error: %s: %s\n" % (err, debug))
        loop.quit()
    return True


def main(device):
    GObject.threads_init()
    Gst.init(None)

    pipeline = Gst.Pipeline()

    source = Gst.ElementFactory.make("rtspsrc", "video-source")
    source.set_property("location", device)
    source.set_property("latency", 300)
    pipeline.add(source)

    depay = Gst.ElementFactory.make("rtph264depay", "depay")
    pipeline.add(depay)
    source.link(depay)

    parse = Gst.ElementFactory.make("h264parse", "parse")
    pipeline.add(parse)
    depay.link(parse)

    decoder = Gst.ElementFactory.make("nvv4l2decoder", "decoder")
    decoder.set_property("drop-frame-interval", 2)
    pipeline.add(decoder)
    parse.link(decoder)

    convert = Gst.ElementFactory.make("nvvideoconvert", "convert")
    pipeline.add(convert)
    decoder.link(convert)

    caps = Gst.Caps.from_string("video/x-raw,width=1920,height=1080,format=I420")
    filter = Gst.ElementFactory.make("capsfilter", "filter")
    filter.set_property("caps", caps)
    pipeline.add(filter)
    convert.link(filter)

    queue = Gst.ElementFactory.make("queue", "queue")
    pipeline.add(queue)
    filter.link(queue)

    sink = Gst.ElementFactory.make("nveglglessink", "video-sink")
    sink.set_property("window-x", 0)
    sink.set_property("window-y", 0)
    sink.set_property("window-width", 1280)
    sink.set_property("window-height", 720)
    sink.set_property("sync", False)

    pipeline.add(sink)

    queue.link(sink)

    loop = GObject.MainLoop()

    bus = pipeline.get_bus()
    bus.add_signal_watch()
    bus.connect("message", bus_call, loop)

    pipeline.set_state(Gst.State.PLAYING)

    try:
        loop.run()
    except:
        pass

    pipeline.set_state(Gst.State.NULL)

if __name__ == "__main__":
    main("rtsp://admin:123456@192.168.0.150:554/H264?ch=1&subtype=0&proto=Onvif")

Does anyone know what the mistake is? Thank you!

Hi,
For your reference, we have python samples in
https://github.com/NVIDIA-AI-IOT/deepstream_python_apps

Hi DaneLLL

I wrote the above code using that same official repository as a reference, but I'm still not able to display the stream.

Hi,
Please try:

$ gst-launch-1.0 uridecodebin uri="rtsp://admin:123456@192.168.0.150:554/H264?ch=1&subtype=0&proto=Onvif" ! nvvideoconvert ! nveglglessink

See if the URI can be played with the gst-launch-1.0 command.

Also, please share your platform: Jetson or desktop GPU.

Hi DaneLLL

The command runs successfully from the terminal, and I'm testing my application on a GTX 1080.

Please see here. Elements that have "sometimes" pads, like rtspsrc, need to be linked in a pad-added callback. Elements with request pads need to be linked by pad after you create one yourself with some_pad = element.get_request_pad("pad_name"). Only elements with always pads can be linked with element_a.link(element_b).
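For the code above, that means rtspsrc cannot be linked with source.link(depay). A minimal sketch of the pad-added approach, reusing the element names from the question (error handling omitted), might look like this:

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

pipeline = Gst.Pipeline()
source = Gst.ElementFactory.make("rtspsrc", "video-source")
depay = Gst.ElementFactory.make("rtph264depay", "depay")
pipeline.add(source)
pipeline.add(depay)

def on_pad_added(src, new_pad, depay):
    # rtspsrc creates one "sometimes" pad per stream once the SDP is
    # known; only link RTP pads to the depayloader, and only link once.
    caps = new_pad.get_current_caps()
    name = caps.get_structure(0).get_name()
    if name == "application/x-rtp":
        sink_pad = depay.get_static_pad("sink")
        if not sink_pad.is_linked():
            new_pad.link(sink_pad)

# The callback fires when the pad actually appears, i.e. only after the
# pipeline is set to PLAYING and the RTSP session is negotiated.
source.connect("pad-added", on_pad_added, depay)
```

The rest of the chain (depay through sink) has only always pads, so those element.link() calls from the question are fine as they are.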

Personally, I found it easier to just use uridecodebin. It also needs to be linked by callback to the rest of the pipeline, but unlike rtspsrc, it also bundles in other necessary elements like the depayloader and the decoder, and it works with file sources as well if you prefix the file path with "file://". I was able to link uridecodebin by callback to my inference bin (a grouping of elements), and it works fine.
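A uridecodebin version could be sketched like this, again assuming a pad-added callback does the linking and using nvvideoconvert as the first downstream element, as in the question:

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

pipeline = Gst.Pipeline()
uribin = Gst.ElementFactory.make("uridecodebin", "source")
# Also works with local files, e.g. "file:///path/to/video.mp4"
uribin.set_property(
    "uri",
    "rtsp://admin:123456@192.168.0.150:554/H264?ch=1&subtype=0&proto=Onvif",
)
convert = Gst.ElementFactory.make("nvvideoconvert", "convert")
pipeline.add(uribin)
pipeline.add(convert)

def on_pad_added(bin_, pad, convert):
    # uridecodebin exposes already-decoded pads; link the raw video pad
    # to the converter and ignore any audio pads.
    caps = pad.get_current_caps()
    if caps.get_structure(0).get_name().startswith("video/"):
        pad.link(convert.get_static_pad("sink"))

uribin.connect("pad-added", on_pad_added, convert)
```

Note there is no rtph264depay, h264parse, or nvv4l2decoder here: uridecodebin picks and plugs those internally based on the stream.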

Note: if you do create your own custom bin (not necessary, but it can be handy), you will have to ghost its pads (warning: the documentation there is really bad). There are also GStreamer functions that can create ready-to-link bins with already-ghosted pads from strings, so you can reuse the same string (or a portion of it) that works on the command line with gst-launch.

Hi,
Please refer to deepstream-test3:
https://github.com/NVIDIA-AI-IOT/deepstream_python_apps/tree/master/apps/deepstream-test3
It may be a good choice to use uridecodebin. Please give the sample a try.

Thanks mdegans and DaneLLL for helping me out.

I've successfully converted the pipeline to Python code using uridecodebin.