Deepstream python test1-rtsp-out with uridecodebin

I have modified the above example to use uridecodebin as the source, in order to play .mp4 files or other RTSP sources, but it doesn't work straight away. I understand that I should demux the H.264 stream after the uridecodebin to make it work, but I don't know how to do that. Can you please help?

Regards,
Kai

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and any other details needed to reproduce.)
• Requirement details (This is for new requirements. Include the module name, i.e. which plugin or which sample application, and the function description.)

Are you familiar with gstreamer programming? GStreamer: open source multimedia framework

Yes. I can work with it a bit, but I'm not an expert.

My pipeline is like

        source_bin = create_source_bin(i, uri_name)
        assert source_bin is not None
        pipeline.add(source_bin)
        sink_pad = streammux.get_request_pad("sink_%u" % i)
        assert sink_pad is not None
        src_pad = source_bin.get_static_pad("src")
        assert src_pad is not None
        src_pad.link(sink_pad)

    streammux.link(pgie)
    pgie.link(tracker)
    tracker.link(sgie_color)
    sgie_color.link(sgie_type)
    sgie_type.link(nvvidconv)
    nvvidconv.link(nvosd)
    nvosd.link(tee)

    tee_msg_pad = tee.get_request_pad("src_%u")
    assert tee_msg_pad is not None
    sink_pad_q1 = queue1.get_static_pad("sink")
    tee_msg_pad.link(sink_pad_q1)

    tee_render_pad = tee.get_request_pad("src_%u")
    assert tee_render_pad is not None
    sink_pad_q2 = queue2.get_static_pad("sink")
    tee_render_pad.link(sink_pad_q2)

    queue1.link(msgconv)
    msgconv.link(msgbroker)
    queue2.link(nvvidconv_postosd)
    nvvidconv_postosd.link(caps)
    caps.link(encoder)
    encoder.link(rtppay)
    rtppay.link(udpsink)

The problem is that when I use uridecodebin as the source, it always seems to hang at Frame Number = 3:

gstname= video/x-raw
features= <Gst.CapsFeatures object at 0x7f83a9b0a8 (GstCapsFeatures at 0x7ec403b220)>
NvMMLiteOpen : Block : BlockType = 4 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
Frame Number = 0 Vehicle Count = 0 Person Count = 0
Frame Number = 1 Vehicle Count = 0 Person Count = 0
Frame Number = 2 Vehicle Count = 0 Person Count = 0
Frame Number = 3 Vehicle Count = 1 Person Count = 3
H264: Profile = 66, Level = 0 
NVMEDIA_ENC: bBlitMode is set to TRUE

GStreamer is fully open source. For uridecodebin usage, you can find many resources on the internet, such as An Example for GStreamer Dynamic Pad (Decodebin) · GitHub

In deepstream, you can refer to deepstream-test3 sample code. C/C++ Sample Apps Source Details — DeepStream 5.1 Release documentation

From the log, it seems that uridecodebin has successfully created the following elements, which are almost identical to those in the test1-rtsp-out example (filesrc -> h264parse -> capsfilter -> nvv4l2decoder). The example works for the .h264 file, but when I change the source to uridecodebin, it just hangs at frame 3.

I will experiment more to see what the problem is.

Decodebin child added: source 

Decodebin child added: decodebin0 

Decodebin child added: h264parse0 

Decodebin child added: capsfilter0 

Decodebin child added: nvv4l2decoder0
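The "Decodebin child added:" lines come from a child-added handler; in the deepstream-test3 convention it looks roughly like this (a sketch, not necessarily the exact code in my script):

```python
def decodebin_child_added(child_proxy, obj, name, user_data):
    # Log each element uridecodebin creates internally; this is what
    # produces the "Decodebin child added: ..." lines above.
    print("Decodebin child added:", name)
    if name.find("decodebin") != -1:
        # The nested decodebin spawns its own children (h264parse,
        # capsfilter, nvv4l2decoder, ...), so subscribe recursively.
        obj.connect("child-added", decodebin_child_added, user_data)
```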

Can you upload your code? The behaviour depends on the details of the whole pipeline, not just a piece of code.

ds-traffic-perception.py (29.3 KB)

Please help looking into it, thanks!

I managed to pin point the problem:

At the end of the pipeline, I had added an overlay sink:

    sink = Gst.ElementFactory.make("nvoverlaysink", "nvvideo-renderer")
    sink.set_property("sync", 0)

However, it was never linked into the pipeline; instead, a udpsink was linked at the end of the pipeline. Once I removed the overlay sink from the pipeline, it started to work.

I don’t know why, but this seems to be the fix…

You never linked the sink element into the pipeline, and there was already a udpsink in your pipeline. You should design your pipeline before you start coding. What pipeline do you actually want?

Please refer to deepstream-test3 sample for how to link sink to the pipeline. deepstream_python_apps/deepstream_test_3.py at master · NVIDIA-AI-IOT/deepstream_python_apps · GitHub

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.