Deepstream RTSP Usability in Python

Please provide complete information as applicable to your setup.

**• Hardware Platform:** Jetson
**• DeepStream Version:** 6.1.1 / 6.2
**• Issue Type:** Question

I have 3 different Jetson devices, each running a Docker container.

Each of these containers uses either DS 6.1.1 or DS 6.2. Each device processes 3 different RTSP streams at 15 FPS each.

The solution works quite well, although one of the RTSP streams may occasionally fail for a certain period of time. When that happens, the container shuts down and the application cannot be started again until the stream is back online.

I would like to solve this situation. During my research, I came across the option of "RTSP reconnect attempts". According to the documentation, this can retry endlessly without the pipeline interrupting the processing of the other streams. Is this correct?

And is there an implementation in Python too?
Unfortunately, the forum post was not followed up further.


Hi @patrick.brocks1

What we did to solve the issue was switching between the actual RTSP stream and a dummy source when we detect an error by reading the GstBus. This solution uses Interpipes, and someone actually added it as an example to the Interpipes repository that is currently on PR. I hope this helps.
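A minimal plain-Python sketch of the decision logic behind that approach (no GstD here; the sink names and the `pick_listen_to` helper are illustrative assumptions — the real switch is a `listen-to` property write on the listener's interpipesrc):

```python
# Bus message types that indicate the RTSP source has dropped out.
FAILURE_MESSAGES = {"error", "eos"}

def pick_listen_to(message_type_name, real_sink, dummy_sink):
    """Given the kind of GstBus message seen on a source pipeline,
    decide which interpipesink the listener should follow."""
    if message_type_name in FAILURE_MESSAGES:
        return dummy_sink
    return real_sink

# Wiring sketch with PyGObject (not executed here):
#   bus = src_pipeline.get_bus()
#   bus.add_signal_watch()
#   def on_message(bus, msg):
#       kind = ("error" if msg.type == Gst.MessageType.ERROR
#               else "eos" if msg.type == Gst.MessageType.EOS
#               else "other")
#       interpipesrc0.set_property(
#           "listen-to", pick_listen_to(kind, "interpipesink0", "dummysink0"))
#   bus.connect("message", on_message)
```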


If you use Python, you can try to use the nvmultiurisrcbin or the nvurisrcbin and set the rtsp-reconnect-interval parameter.
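As a rough sketch of what that looks like from Python (the bin and helper names are assumptions for illustration; `rtsp-reconnect-interval` is the documented nvurisrcbin property, in seconds):

```python
def source_bin_description(index, uri, reconnect_interval_s=10):
    """Build a gst-launch style description for one self-reconnecting
    RTSP source based on nvurisrcbin."""
    return (f"nvurisrcbin name=src_{index} uri={uri} "
            f"rtsp-reconnect-interval={reconnect_interval_s}")

# Equivalent with the object API (PyGObject, not executed here):
#   src = Gst.ElementFactory.make("nvurisrcbin", f"src_{index}")
#   src.set_property("uri", uri)
#   src.set_property("rtsp-reconnect-interval", 10)  # retry every 10 s
```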


Thanks for your reply. I will work on this scenario and update this thread step by step as I make progress.

I tried to use the Python code attached to your link, but unfortunately I'm getting an error like:

Traceback (most recent call last):
  File "test_pipeline.py", line 272, in <module>
    main(sys.argv)
  File "test_pipeline.py", line 224, in main
    create_source_pipeline(index, stream_url)
  File "test_pipeline.py", line 117, in create_source_pipeline
    input_interpipesink.set_property("name", f"sink{index}")
AttributeError: 'NoneType' object has no attribute 'set_property'

Technically this code should work in my environment, right? What am I missing here?

EDIT:
I found the missing part. Now I can run the pipeline. But when the RTSP source disconnects, the EOS event triggers and the app exits.

Terminal:

My use case would be that when a URI disconnects, the dummy pipeline takes over. So I just commented out the quit method:


Now it seems to be working.

My question now is: how do I combine this functionality with inference, more like in the analytics sample? It seems I am stuck at this point.

Is there any example given for using this source bin? In the documentation I only find clues for manually switching between sources, assuming both resources are online. But there is no error handling given in the documentation. In my use case I have to react to an RTSP error, right? I'm not that deep into coding, so where do I have to make changes, supposing I use the analytics example pipeline in Python?

You can refer to /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-server/deepstream_server_app.cpp. You can also refer to the Guide: how-to-use-nvmultiurisrcbin-in-a-pipeline.

Yeah, this is C code. So are there any Python examples?

And I tried to find the implementation of nvmultiurisrcbin in "/opt/nvidia/deepstream/deepstream/sources/gst-plugins/gst-nvmultiurisrcbin/", but there is no plugin with that name, on both versions (6.1.1 and 6.2).

System:


@yuweiw

Hi Miguel, at the moment I'm experimenting with the interpipe possibility. The default pipeline on the website you shared is already working. I only noticed that 2 individual pipelines are being built there. Is there an example, or a way, to connect the interpipes via a streammux? I made a sketch for this. I'm thinking of something like this:

@miguel.taylor

Hi @patrick.brocks1

I don’t have an example of nvstreammux switching exactly like in your diagram that I can share, but we have implemented something similar for several clients.

We have an example that you can look into: NVIDIA GTC 2020: How to build a multi-camera Media Server for AI processing on Jetson. It uses interpipes and GstD to run a media server with DeepStream processing. You can easily implement the switching functionality based on this example.

  1. Add a videotestsrc pipeline for each source with the same resolution and format. The description would look something like this:
videotestsrc ! "video/x-raw,width=1920,height=1080" ! nvvideoconvert ! "video/x-raw,format=RGBA" ! interpipesink sync=false async=false name=dummysink0
  2. Listen to the GstBus on each source pipeline to filter errors and disconnections. GstD has the bus_filter and bus_read commands for reading the bus. You might also want to use bus_timeout to set a timeout because bus_read is a blocking call.
gstd_client.bus_filter(interpipesink0_pipeline_name, "eos+warning+error+info+state_changed")
result = gstd_client.bus_read(interpipesink0_pipeline_name)
if "error" in result:
    # switch sources
  3. Switch sources on the nvstreammux pipeline when you detect a disconnection. You only need to set the listen-to property on the corresponding interpipesrc.

The pipeline description for the listener would look something like this:

interpipesrc name=interpipesrc0 is-live=true allow-renegotiation=true stream-sync=2 listen-to=interpipesink0 ! \
mux.sink_0 \
interpipesrc name=interpipesrc1 is-live=true allow-renegotiation=true stream-sync=2 listen-to=interpipesink1 ! \
mux.sink_1 \
nvstreammux name=mux batch-size=2 width=1920 height=1080 !  \
queue leaky=2 max-size-buffers=10 ! nvvideoconvert ! \
...

The switching is performed with element_set:

gstd_client.element_set(nvstreammux_pipeline_name, "interpipesrc0", "listen-to", "dummysink0")
  4. You can try to recover the RTSP pipeline by either stopping and playing it, or waiting in another thread until the RTSP stream comes back online to switch pipelines back.
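The steps above can be condensed into a small piece of plain-Python state handling (the sink names follow the descriptions above; the actual property write, shown in the comments, would go through gstd_client.element_set or set_property):

```python
class SourceSwitcher:
    """Tracks which interpipesink a listener slot should follow."""

    def __init__(self, index):
        self.real = f"interpipesink{index}"
        self.dummy = f"dummysink{index}"
        self.current = self.real

    def on_failure(self):
        # Step 3: RTSP error/EOS detected on the bus, fall back to the dummy feed.
        # e.g. gstd_client.element_set(pipeline_name, "interpipesrc0",
        #                              "listen-to", self.dummy)
        self.current = self.dummy
        return self.current

    def on_recovered(self):
        # Step 4: the RTSP stream is back online, switch back to the real feed.
        self.current = self.real
        return self.current
```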

@miguel.taylor Thanks for giving me this clue. I'm trying to build a similar pipeline in Python, but I'm stuck on an error message. Here is my code:

    main_pipeline.add(interpipesrc0)
    main_pipeline.add(interpipesrc1)
    main_pipeline.add(interpipesrc2)
    main_pipeline.add(streammux)
    main_pipeline.add(vid_converter)
    main_pipeline.add(filter1)
    main_pipeline.add(img_enc)
    main_pipeline.add(sink)
    # link elements
    interpipesrc0.link(streammux.get_request_pad("sink_0"))
    interpipesrc1.link(streammux.get_request_pad("sink_1"))
    interpipesrc2.link(streammux.get_request_pad("sink_2"))

The error message is:

In INTERPIPESRC-Liste : [<__gi__.GstInterPipeSrc object at 0xffff84285300 (GstInterPipeSrc at 0x3665f990)>, <__gi__.GstInterPipeSrc object at 0xffff84285380 (GstInterPipeSrc at 0x3665fe10)>, <__gi__.GstInterPipeSrc object at 0xffff842853c0 (GstInterPipeSrc at 0x36660290)>]
Traceback (most recent call last):
  File "test_pipeline_copy.py", line 293, in <module>
    main(sys.argv)
  File "test_pipeline_copy.py", line 247, in main
    create_main_pipeline(main_pipeline)   
  File "test_pipeline_copy.py", line 210, in create_main_pipeline
    interpipesrc0.link(streammux.get_request_pad("sink_0"))
TypeError: argument dest: Expected Gst.Element, but got __gi__.GstNvStreamPad

EDIT: I figured out the error.

    queue1.get_static_pad("src").link(streammux.get_request_pad("sink_0"))
    queue2.get_static_pad("src").link(streammux.get_request_pad("sink_1"))
    queue3.get_static_pad("src").link(streammux.get_request_pad("sink_2"))

did the trick.


Hi @miguel.taylor ,

I'm currently stuck getting an output after adding the streammux.

Main_pipeline:

def create_main_pipeline(main_pipeline):

    interpipesrc0 = create_gst_ele("interpipesrc", "interpipesrc_0")

    interpipesrc1 = create_gst_ele("interpipesrc", "interpipesrc_1")
    
    interpipesrc2 = create_gst_ele("interpipesrc", "interpipesrc_2")
    #interpipesrc camera 1
    interpipesrc0.set_property("listen-to", "sink0")
    interpipesrc0.set_property("is-live", True)
    interpipesrc0.set_property("stream-sync", 1)
    interpipesrc0.set_property("emit-signals", True)
    interpipesrc0.set_property("allow-renegotiation", True)
    interpipesrc0.set_property("do-timestamp", True)
    #interpipesrc camera 2
    interpipesrc1.set_property("listen-to", "sink1")
    interpipesrc1.set_property("is-live", True)
    interpipesrc1.set_property("stream-sync", 1)
    interpipesrc1.set_property("emit-signals", True)
    interpipesrc1.set_property("allow-renegotiation", True)
    interpipesrc1.set_property("do-timestamp", True)
    #interpipesrc camera 3
    interpipesrc2.set_property("listen-to", "sink2")
    interpipesrc2.set_property("is-live", True)
    interpipesrc2.set_property("stream-sync", 1)
    interpipesrc2.set_property("emit-signals", True)
    interpipesrc2.set_property("allow-renegotiation", True)
    interpipesrc2.set_property("do-timestamp", True)
    #Creating Streammux 
    streammux = create_gst_ele("nvstreammux","mux")
    if not streammux:
        sys.stderr.write(" Unable to create NvStreamMux \n")
    streammux.set_property('width', 960)
    streammux.set_property('height', 544)
    streammux.set_property('batch-size', 3)
    streammux.set_property('batched-push-timeout', 40000)
    # Creating Queue after Streammux
    queue1=create_gst_ele("queue","q1")
    queue1.set_property("max-size-buffers",10)
    queue1.set_property("leaky",2)
    #Creating vid_conv
    vid_converter = create_gst_ele("nvvideoconvert", "main_convertor")
    # capsfilter after streammux
    filter1 = create_gst_ele("capsfilter", "filter")
    filter1.set_property("caps", Gst.Caps.from_string("video/x-raw, format=RGB"))

    encoder = create_gst_ele("avenc_mpeg4", "encoder")
    if not encoder:
        sys.stderr.write(" Unable to create encoder \n")

    encoder.set_property("bitrate", 2000000)
    
    
    print("Creating Code Parser \n")   
    codeparser = create_gst_ele("mpeg4videoparse", "mpeg4-parser")
    if not codeparser:
        sys.stderr.write(" Unable to create code parser \n")

    print("Creating Container \n")
    container = create_gst_ele("qtmux", "qtmux")
    if not container:
        sys.stderr.write(" Unable to create container \n")

    print("Creating Sink \n")
    sink = create_gst_ele("filesink", "filesink")
    if not sink:
        sys.stderr.write(" Unable to create file sink \n")

    sink.set_property("location", "./out.mp4")
    sink.set_property("sync", 1)
    sink.set_property("async", 0)


   # Path(output_path).parent.mkdir(parents=True,exist_ok=True)
    main_pipeline.add(queue1)
    main_pipeline.add(interpipesrc0)
    main_pipeline.add(interpipesrc1)
    main_pipeline.add(interpipesrc2)
    main_pipeline.add(streammux)
    main_pipeline.add(vid_converter)
    main_pipeline.add(filter1)
    main_pipeline.add(encoder)
    main_pipeline.add(sink)
    main_pipeline.add(codeparser)
    main_pipeline.add(container)
 
    interpipesrc0.get_static_pad("src").link(streammux.get_request_pad("sink_0"))
    interpipesrc1.get_static_pad("src").link(streammux.get_request_pad("sink_1"))
    interpipesrc2.get_static_pad("src").link(streammux.get_request_pad("sink_2"))
    streammux.link(queue1)
    queue1.link(vid_converter)
    vid_converter.link(filter1)
    filter1.link(encoder)
    encoder.link(codeparser)
    codeparser.link(container)
    container.link(sink)
    INTERPIPESRC.append(interpipesrc0)
    INTERPIPESRC.append(interpipesrc1)
    INTERPIPESRC.append(interpipesrc2)

main_function:

def main(args):
    os.environ['GST_DEBUG'] = "3"
    if len(args) < 2:
        logger.warning("usage: %s <rtsp_url_1> [rtsp_url_2] ... [rtsp_url_N]\n" % args[0])
        sys.exit()

    global perf_data
    number_sources = len(args) - 1
    sources_urls = args[1:]

    # Standard GStreamer initialization
    Gst.init(None)
    logger.debug("Creating Pipeline \n ")

    global dummy_pipeline, main_pipeline
    main_pipeline = Gst.Pipeline()
    if not main_pipeline:
        logger.error("Unable to create Pipeline")
    dummy_pipeline = Gst.Pipeline.new(f"dummy-pipeline")
    if not dummy_pipeline:
        logger.error("Unable to create dummy-pipeline")


    logger.debug("Creating element \n ")
    for index, stream_url in enumerate(sources_urls):
        create_source_pipeline(index, stream_url)
        logger.debug(f"SRC_Pipeline {index} created")
        create_dummy_pipeline(index, dummy_pipeline)
        logger.debug(f"DUMMY_Pipeline {index} created")
    logger.debug ("Creating Main_Pipeline")
    create_main_pipeline(main_pipeline)   
    logger.debug ("Main_Pipeline created successfully")

    # create an event loop and feed GStreamer bus messages to it
    loop = GLib.MainLoop()
    for src_pipeline in SRC_PIPELINES:
        src_bus = src_pipeline.get_bus()
        src_bus.add_signal_watch()
        src_bus.connect("message", test_utils.bus_call_src_pipeline, loop, src_pipeline)


    bus = main_pipeline.get_bus()
    bus.add_signal_watch()
    bus.connect("message", test_utils.bus_call, loop, main_pipeline)

    logger.debug("Starting pipeline \n")
    # start playback and listen to events
    dummy_pipeline.set_state(Gst.State.PLAYING)
    for src_pipeline1 in SRC_PIPELINES:
        src_pipeline1.set_state(Gst.State.PLAYING)
    time.sleep(number_sources)
    main_pipeline.set_state(Gst.State.PLAYING)
    try:
        loop.run()
    except:
        pass

Do you have any idea why there is no output after the streammux?
When using the "template", I'm getting results, so I'm guessing the interpipes are not the problem. Maybe it is a more general problem?

The main issue I noticed is that the encoder you are using doesn’t support the input format you are giving it.

From the avenc_mpeg4 inspect:

  SINK template: 'sink'
    Availability: Always
    Capabilities:
      video/x-raw
                 format: I420

Removing the capsfilter and letting the pipeline negotiate the caps on its own should solve this issue.
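Alternatively, if you want to keep a capsfilter in place, it has to match the encoder's sink template rather than RGB. A small sketch (the helper name is made up; the I420 requirement comes from the gst-inspect output above):

```python
ENCODER_SINK_FORMAT = "I420"  # the only raw format avenc_mpeg4 accepts

def caps_for_encoder(fmt=ENCODER_SINK_FORMAT):
    """Build a caps string compatible with avenc_mpeg4's sink pad."""
    return f"video/x-raw, format={fmt}"

# usage (PyGObject, not executed here):
#   filter1.set_property("caps", Gst.Caps.from_string(caps_for_encoder()))
```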

@miguel.taylor I tried it like you said, but the output file is still 0 MB and I cannot play it, so I think there must be another error.

I tried to figure out whether there is any suspicious output in the debug log, and the only part that looks suspicious is this:

0:00:01.265641618 1163041 0xffff38007d20 INFO  videodecoder gstvideodecoder.c:3184:gst_video_decoder_clip_and_push_buf:<av_decoder> First buffer since flush took 0:00:00.470819504 to produce
0:00:01.280475015 1163041 0x310c980 INFO  GST_EVENT gstevent.c:820:gst_event_new_caps: creating caps event video/x-raw, format=(string)I420, width=(int)2592, height=(int)1944, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)15/1
0:00:01.285727251 1163041 0x310c980 INFO  GST_PADS gstpad.c:4237:gst_pad_peer_query:<src_videoscaler_1:src> pad has no peer
0:00:01.294368588 1163041 0x310c980 INFO  basetransform gstbasetransform.c:1317:gst_base_transform_setcaps:<src_convertor1> reuse caps
0:00:01.295105553 1163041 0x310c980 INFO  basetransform gstbasetransform.c:1317:gst_base_transform_setcaps:<src_videoscaler_1> reuse caps
0:00:01.295434178 1163041 0x310c980 INFO  GST_PADS gstpad.c:4237:gst_pad_peer_query:<src_videoscaler_1:src> pad has no peer
0:00:01.303204750 1163041 0x310c980 INFO  videodecoder gstvideodecoder.c:3184:gst_video_decoder_clip_and_push_buf:<av_decoder> First buffer since flush took 0:00:00.522186765 to produce
0:00:01.683109217 1163041 0xffff38007d20 INFO  videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder> upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)339360, maximum-bitrate=(uint)339360, bitrate=(uint)1935000;
(the "upstream tags" line then repeats many times with slowly changing bitrate values)
0:00:03.003722534 e[332m1163041e[00m      0x310c800 e[36mINFO   e[00m e[00m        videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder>e[00m upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)333840, maximum-bitrate=(uint)946800, bitrate=(uint)985520;
0:00:03.040165289 e[332m1163041e[00m      0x310c980 e[36mINFO   e[00m e[00m        videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder>e[00m upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)806040, maximum-bitrate=(uint)1064760, bitrate=(uint)1213412;
0:00:03.120778841 e[332m1163041e[00m      0x310c980 e[36mINFO   e[00m e[00m        videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder>e[00m upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)806040, maximum-bitrate=(uint)1631040, bitrate=(uint)1226883;
0:00:03.142811965 e[332m1163041e[00m 0xffff300031e0 e[36mINFO   e[00m e[00;01;34m           GST_EVENT gstevent.c:820:gst_event_new_caps:e[00m creating caps event application/x-rtcp
0:00:03.143247859 e[332m1163041e[00m 0xffff300031e0 e[36mINFO   e[00m e[00;01;34m           GST_EVENT gstevent.c:900:gst_event_new_segment:e[00m creating segment event time segment start=0:00:00.000000000, offset=0:00:00.000000000, stop=99:99:99.999999999, rate=1.000000, applied_rate=1.000000, flags=0x00, time=0:00:00.000000000, base=0:00:00.000000000, position 0:00:00.000000000, duration 99:99:99.999999999
0:00:03.173674435 e[332m1163041e[00m 0xffff38007d20 e[36mINFO   e[00m e[00m        videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder>e[00m upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)333840, maximum-bitrate=(uint)8664600, bitrate=(uint)1210785;
0:00:03.179373542 e[332m1163041e[00m      0x310c800 e[36mINFO   e[00m e[00m        videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder>e[00m upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)333840, maximum-bitrate=(uint)8664600, bitrate=(uint)1210785;
0:00:03.284501080 e[332m1163041e[00m 0xffff38007d20 e[36mINFO   e[00m e[00m        videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder>e[00m upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)333840, maximum-bitrate=(uint)8664600, bitrate=(uint)1177245;
0:00:03.284535034 e[332m1163041e[00m      0x310c800 e[36mINFO   e[00m e[00m        videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder>e[00m upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)333840, maximum-bitrate=(uint)8664600, bitrate=(uint)1177245;
0:00:03.404084748 e[332m1163041e[00m 0xffff38007d20 e[36mINFO   e[00m e[00m        videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder>e[00m upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)333840, maximum-bitrate=(uint)8664600, bitrate=(uint)1137236;
0:00:03.404274102 e[332m1163041e[00m      0x310c800 e[36mINFO   e[00m e[00m        videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder>e[00m upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)333840, maximum-bitrate=(uint)8664600, bitrate=(uint)1137236;
0:00:03.440200702 e[332m1163041e[00m      0x310c980 e[36mINFO   e[00m e[00m        videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder>e[00m upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)806040, maximum-bitrate=(uint)1631040, bitrate=(uint)1195783;
0:00:03.548601144 e[332m1163041e[00m 0xffff44003aa0 e[36mINFO   e[00m e[00;01;34m           GST_EVENT gstevent.c:820:gst_event_new_caps:e[00m creating caps event application/x-rtcp
0:00:03.549073616 e[332m1163041e[00m 0xffff44003aa0 e[36mINFO   e[00m e[00;01;34m           GST_EVENT gstevent.c:900:gst_event_new_segment:e[00m creating segment event time segment start=0:00:00.000000000, offset=0:00:00.000000000, stop=99:99:99.999999999, rate=1.000000, applied_rate=1.000000, flags=0x00, time=0:00:00.000000000, base=0:00:00.000000000, position 0:00:00.000000000, duration 99:99:99.999999999
0:00:03.563760765 e[332m1163041e[00m 0xffff38007d20 e[36mINFO   e[00m e[00m        videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder>e[00m upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)333840, maximum-bitrate=(uint)8664600, bitrate=(uint)1104960;
0:00:03.564234325 e[332m1163041e[00m      0x310c800 e[36mINFO   e[00m e[00m        videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder>e[00m upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)333840, maximum-bitrate=(uint)8664600, bitrate=(uint)1104960;
0:00:03.683628927 e[332m1163041e[00m 0xffff38007d20 e[36mINFO   e[00m e[00m        videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder>e[00m upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)333840, maximum-bitrate=(uint)8664600, bitrate=(uint)1072971;
0:00:03.684048373 e[332m1163041e[00m      0x310c800 e[36mINFO   e[00m e[00m        videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder>e[00m upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)333840, maximum-bitrate=(uint)8664600, bitrate=(uint)1072971;
0:00:03.725045216 e[332m1163041e[00m 0xffff4800a300 e[36mINFO   e[00m e[00;01;34m           GST_EVENT gstevent.c:820:gst_event_new_caps:e[00m creating caps event application/x-rtcp
0:00:03.725587324 e[332m1163041e[00m 0xffff4800a300 e[36mINFO   e[00m e[00;01;34m           GST_EVENT gstevent.c:900:gst_event_new_segment:e[00m creating segment event time segment start=0:00:00.000000000, offset=0:00:00.000000000, stop=99:99:99.999999999, rate=1.000000, applied_rate=1.000000, flags=0x00, time=0:00:00.000000000, base=0:00:00.000000000, position 0:00:00.000000000, duration 99:99:99.999999999
0:00:03.801240047 e[332m1163041e[00m      0x310c980 e[36mINFO   e[00m e[00m        videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder>e[00m upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)806040, maximum-bitrate=(uint)1631040, bitrate=(uint)1169959;
0:00:03.805071314 e[332m1163041e[00m      0x310c800 e[36mINFO   e[00m e[00m        videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder>e[00m upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)333840, maximum-bitrate=(uint)8664600, bitrate=(uint)1048180;
0:00:03.805195448 e[332m1163041e[00m 0xffff38007d20 e[36mINFO   e[00m e[00m        videodecoder gstvideodecoder.c:1312:gst_video_decoder_sink_event_default:<av_decoder>e[00m upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", minimum-bitrate=(uint)333840, maximum-bitrate=(uint)8664600, bitrate=(uint)1048180;

I also tried changing the filesink, but there is no output either.
I attached the complete output log to this reply. Maybe there is a clue in it, but I can't spot it.

GST_DEBUG.log (744.6 KB)

my code (py-file , zipped)
test_pipeline_copy.zip (2.8 KB)

EDIT: Changing the debug level to 2 gives me this summary of events:

0:00:00.938755912 1175429 0x11209800 WARN udpsrc gstudpsrc.c:1445:gst_udpsrc_open:<udpsrc0> warning: Could not create a buffer of requested 524288 bytes (Operation not permitted). Need net.admin privilege?
0:00:00.938766088 1175429 0x1114d060 WARN udpsrc gstudpsrc.c:1445:gst_udpsrc_open:<udpsrc1> warning: Could not create a buffer of requested 524288 bytes (Operation not permitted). Need net.admin privilege?
0:00:00.939283711 1175429 0x11209800 WARN udpsrc gstudpsrc.c:1455:gst_udpsrc_open:<udpsrc0> have udp buffer of 212992 bytes while 524288 were requested
0:00:00.939369859 1175429 0x1114d060 WARN udpsrc gstudpsrc.c:1455:gst_udpsrc_open:<udpsrc1> have udp buffer of 212992 bytes while 524288 were requested
0:00:00.941051181 1175429 0x1114d060 WARN udpsrc gstudpsrc.c:1445:gst_udpsrc_open:<udpsrc3> warning: Could not create a buffer of requested 524288 bytes (Operation not permitted). Need net.admin privilege?
0:00:00.941232437 1175429 0x1114d060 WARN udpsrc gstudpsrc.c:1455:gst_udpsrc_open:<udpsrc3> have udp buffer of 212992 bytes while 524288 were requested
0:00:00.972388249 1175429 0x10fa8e40 WARN udpsrc gstudpsrc.c:1445:gst_udpsrc_open:<udpsrc5> warning: Could not create a buffer of requested 524288 bytes (Operation not permitted). Need net.admin privilege?
0:00:00.972725288 1175429 0x10fa8e40 WARN udpsrc gstudpsrc.c:1455:gst_udpsrc_open:<udpsrc5> have udp buffer of 212992 bytes while 524288 were requested
0:00:00.974484598 1175429 0x10fa8e40 WARN udpsrc gstudpsrc.c:1445:gst_udpsrc_open:<udpsrc6> warning: Could not create a buffer of requested 524288 bytes (Operation not permitted). Need net.admin privilege?
0:00:00.974733313 1175429 0x10fa8e40 WARN udpsrc gstudpsrc.c:1455:gst_udpsrc_open:<udpsrc6> have udp buffer of 212992 bytes while 524288 were requested
0:00:03.964021642 1175429 0xffff0c0bc4d0 WARN basesink gstbasesink.c:1209:gst_base_sink_query_latency:<filesink> warning: Pipeline construction is invalid, please add queues.
0:00:03.964691880 1175429 0xffff0c0bc4d0 WARN basesink gstbasesink.c:1209:gst_base_sink_query_latency:<filesink> warning: Not enough buffering available for the processing deadline of 0:00:00.020000000, add enough queues to buffer 0:00:00.020000000 additional data. Shortening processing latency to 0:00:00.000000000.
0:00:06.015581690 1175429 0x10fa8e40 WARN rtspsrc gstrtspsrc.c:6326:gst_rtsp_src_receive_response:<rtspsrc1> receive interrupted
0:00:06.015841350 1175429 0x10fa8e40 WARN rtspsrc gstrtspsrc.c:6424:gst_rtspsrc_try_send:<rtspsrc1> receive interrupted
0:00:06.015925770 1175429 0x10fa8e40 WARN rtspsrc gstrtspsrc.c:8672:gst_rtspsrc_pause:<rtspsrc1> PAUSE interrupted
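The udpsrc warnings above are usually harmless (GStreamer simply falls back to the kernel default of 212992 bytes), but if you want the full 524288-byte UDP receive buffer inside the container, both the host limit and the container's capabilities have to allow it. A sketch of the idea, assuming a Docker setup; the image name is a placeholder and the value matches the requested buffer size:

```shell
# Raise the host-side ceiling for UDP receive buffers (the requested size was 524288).
sudo sysctl -w net.core.rmem_max=524288

# Run the container with CAP_NET_ADMIN so GStreamer is allowed to set the buffer size.
# Add this flag to your existing docker run invocation.
docker run --cap-add=NET_ADMIN <your-deepstream-image>
```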

@miguel.taylor I was able to avoid this error, but unfortunately I am now getting an error from Interpipes. It says:

.191996010  8857 0xffff5801ef60 FIXME               basesink gstbasesink.c:3246:gst_base_sink_default_event:<sink2> stream-start event without group-id. Consider implementing group-id handling in the upstream elements
0:00:01.192273624  8857     0x3aba1800 FIXME               basesink gstbasesink.c:3246:gst_base_sink_default_event:<sink0> stream-start event without group-id. Consider implementing group-id handling in the upstream elements
[D 231031 10:20:45 test_utils:192] New Data Stream started at source source-pipeline2
    
[D 231031 10:20:45 test_utils:192] New Data Stream started at source source-pipeline0
    
0:00:01.197675338  8857 0xffff5801ef60 ERROR          interpipesink gstinterpipesink.c:452:gst_inter_pipe_sink_get_caps:<sink2> Failed to obtain an intersection between upstream elements and listeners
0:00:01.199458980  8857     0x3aba1800 ERROR          interpipesink gstinterpipesink.c:452:gst_inter_pipe_sink_get_caps:<sink0> Failed to obtain an intersection between upstream elements and listeners
[D 231031 10:20:45 test_utils:188] LATENCY Message Structure: av_decoder
    
[D 231031 10:20:45 test_utils:188] LATENCY Message Structure: av_decoder
    
[D 231031 10:20:45 test_utils:188] LATENCY Message Structure: av_decoder
    
[D 231031 10:20:45 test_utils:188] LATENCY Message Structure: av_decoder
    
[D 231031 10:20:45 test_utils:185] Object Name: sink2, Message: <flags GST_MESSAGE_TAG of type Gst.MessageType>
[D 231031 10:20:45 test_utils:185] Object Name: sink0, Message: <flags GST_MESSAGE_TAG of type Gst.MessageType>
[D 231031 10:20:45 test_utils:185] Object Name: sink0, Message: <flags GST_MESSAGE_STATE_CHANGED of type Gst.MessageType>
[D 231031 10:20:45 test_utils:185] Object Name: source-pipeline0, Message: <flags GST_MESSAGE_ASYNC_DONE of type Gst.MessageType>
[D 231031 10:20:45 test_utils:185] Object Name: sink0, Message: <flags GST_MESSAGE_STATE_CHANGED of type Gst.MessageType>
[D 231031 10:20:45 test_utils:185] Object Name: source-pipeline0, Message: <flags GST_MESSAGE_STATE_CHANGED of type Gst.MessageType>
[D 231031 10:20:45 test_utils:185] Object Name: sink2, Message: <flags GST_MESSAGE_STATE_CHANGED of type Gst.MessageType>
[D 231031 10:20:45 test_utils:185] Object Name: source-pipeline2, Message: <flags GST_MESSAGE_ASYNC_DONE of type Gst.MessageType>
[D 231031 10:20:45 test_utils:185] Object Name: sink2, Message: <flags GST_MESSAGE_STATE_CHANGED of type Gst.MessageType>
[D 231031 10:20:45 test_utils:185] Object Name: source-pipeline2, Message: <flags GST_MESSAGE_STATE_CHANGED of type Gst.MessageType>
[D 231031 10:20:46 test_utils:185] Object Name: sink0, Message: <flags GST_MESSAGE_TAG of type Gst.MessageType>
[D 231031 10:20:46 test_utils:185] Object Name: sink2, Message: <flags GST_MESSAGE_TAG of type Gst.MessageType>
0:00:03.032931619  8857     0x3aba1980 FIXME               basesink gstbasesink.c:3246:gst_base_sink_default_event:<sink1> stream-start event without group-id. Consider implementing group-id handling in the upstream elements
[D 231031 10:20:47 test_utils:192] New Data Stream started at source source-pipeline1
    
0:00:03.039297030  8857     0x3aba1980 ERROR          interpipesink gstinterpipesink.c:452:gst_inter_pipe_sink_get_caps:<sink1> Failed to obtain an intersection between upstream elements and listeners
[D 231031 10:20:47 test_utils:188] LATENCY Message Structure: av_decoder
    
[D 231031 10:20:47 test_utils:188] LATENCY Message Structure: av_decoder
    

My main pipeline now looks as follows:

 #creating the interpipe SRC's
    interpipesrc0 = create_gst_ele("interpipesrc", "interpipesrc_0")
    interpipesrc1 = create_gst_ele("interpipesrc", "interpipesrc_1")
    interpipesrc2 = create_gst_ele("interpipesrc", "interpipesrc_2")
    #interpipesrc camera 1
    interpipesrc0.set_property("listen-to", "sink0")
    interpipesrc0.set_property("is-live", True)
    interpipesrc0.set_property("stream-sync", 0)
    interpipesrc0.set_property("emit-signals", True)
    interpipesrc0.set_property("allow-renegotiation", True)
    interpipesrc0.set_property("do-timestamp", True)
    #interpipesrc camera 2
    interpipesrc1.set_property("listen-to", "sink1")
    interpipesrc1.set_property("is-live", True)
    interpipesrc1.set_property("stream-sync", 0)
    interpipesrc1.set_property("emit-signals", True)
    interpipesrc1.set_property("allow-renegotiation", True)
    interpipesrc1.set_property("do-timestamp", True)
    #interpipesrc camera 3
    interpipesrc2.set_property("listen-to", "sink2")
    interpipesrc2.set_property("is-live", True)
    interpipesrc2.set_property("stream-sync", 0)
    interpipesrc2.set_property("emit-signals", True)
    interpipesrc2.set_property("allow-renegotiation", True)
    interpipesrc2.set_property("do-timestamp", True)
    #testing a few options.
    vid_converter_src1 = create_gst_ele("nvvidconv", f"src_convertor_main1")
    vid_converter_src2 = create_gst_ele("nvvidconv", f"src_convertor_main2")
    vid_converter_src3 = create_gst_ele("nvvidconv", f"src_convertor_main3")
    filter_src1= create_gst_ele("capsfilter", f"filter_src1")
    filter_src1.set_property("caps", Gst.Caps.from_string("video/x-raw(memory:NVMM),width=960,height=544,format=I420"))
    filter_src2 = create_gst_ele("capsfilter", f"filter_src2")
    filter_src2.set_property("caps", Gst.Caps.from_string("video/x-raw(memory:NVMM),width=960,height=544,format=I420"))
    filter_src3 = create_gst_ele("capsfilter", f"filter_src3")
    filter_src3.set_property("caps", Gst.Caps.from_string("video/x-raw(memory:NVMM),width=960,height=544,format=I420"))
    #Creating Streammux
    streammux = create_gst_ele("nvstreammux","mux")
    if not streammux:
        sys.stderr.write(" Unable to create NvStreamMux \n")
    streammux.set_property('width', 960)
    streammux.set_property('height', 544)
    streammux.set_property('batch-size', 3)
    streammux.set_property('live-source', True)
    streammux.set_property('batched-push-timeout', 40000)
    
    # adding SRC PADS to see where is an output or not 
    interpipesrc0_srcpad = interpipesrc0.get_static_pad("src")
    if not interpipesrc0_srcpad:
        logger.error("Unable to create src pad\n")
    interpipesrc0_srcpad.add_probe(
        Gst.PadProbeType.BUFFER,
        test_function
    )
    mux_srcpad = streammux.get_static_pad("src")
    if not mux_srcpad:
        logger.error("Unable to create src pad\n")
    mux_srcpad.add_probe(
        Gst.PadProbeType.BUFFER,
        test_function
    )
    # Creating Queue after Streammux
    queue1=create_gst_ele("queue","q1")
    queue1.set_property("leaky",2)
    #Creating vid_conv
    vid_converter = create_gst_ele("nvvideoconvert", f"main_convertor")
    # capsfilter after streammux -- skipped in the pipeline link per miguel.taylor's message
    filter1 = create_gst_ele("capsfilter", f"filter")
    filter1.set_property("caps", Gst.Caps.from_string("video/x-raw, format=I420"))
    # encoder after capsfilter
    encoder = create_gst_ele("avenc_mpeg4", "encoder")
    if not encoder:
        sys.stderr.write(" Unable to create encoder \n")

    #encoder.set_property("bitrate", 2000000)
    
    # creating the queues due to the "please add queues" / "not enough buffering" warnings
    queue_filesink = create_gst_ele("queue","filesink_queue")
    queue_filesink.set_property("leaky",2)
    
    queue_filesink2 = create_gst_ele("queue","filesink_queue2")
    queue_filesink2.set_property("leaky",2)
    
    queue_filesink3 = create_gst_ele("queue","filesink_queue3")

    # Codeparser 
    print("Creating Code Parser \n")   
    codeparser = create_gst_ele("mpeg4videoparse", "mpeg4-parser")
    if not codeparser:
        sys.stderr.write(" Unable to create code parser \n")
    # container (muxer)
    print("Creating Container \n")
    container = create_gst_ele("qtmux", "qtmux")
    if not container:
        sys.stderr.write(" Unable to create container \n")
    # Sink
    print("Creating Sink \n")
    sink = create_gst_ele("filesink", "filesink")
    if not sink:
        sys.stderr.write(" Unable to create file sink \n")

    sink.set_property("location", "./out.mp4")
    sink.set_property("sync", 1)
    sink.set_property("async", 1)
    sink.set_property("processing-deadline", 200000000)

    # ADDING
    main_pipeline.add(queue1)
    main_pipeline.add(interpipesrc0)
    main_pipeline.add(interpipesrc1)
    main_pipeline.add(interpipesrc2)
    main_pipeline.add(streammux)
    main_pipeline.add(vid_converter)
    main_pipeline.add(vid_converter_src1)
    main_pipeline.add(vid_converter_src2)
    main_pipeline.add(vid_converter_src3)
    main_pipeline.add(filter_src1)
    main_pipeline.add(filter_src2)
    main_pipeline.add(filter_src3)
    

    #main_pipeline.add(filter1)
    logger.debug ("Skipped filter 1 in Main Pipeline due to Miguel Taylor")
    main_pipeline.add(encoder)
    main_pipeline.add(sink)
    main_pipeline.add(codeparser)
    main_pipeline.add(container)
    # adding filesink queues for latency error?
    main_pipeline.add(queue_filesink)
    main_pipeline.add(queue_filesink2)
    main_pipeline.add(queue_filesink3)
    
    #add options
    interpipesrc0.link(filter_src1)
    interpipesrc1.link(filter_src2)
    interpipesrc2.link(filter_src3)
    #vid_converter_src1.link(filter_src1)
    #vid_converter_src2.link(filter_src2)
    #vid_converter_src3.link(filter_src3)

    #LINKING
    ##DEBUGGING
    # Interpipe src on Streammux sink
    filter_src1.get_static_pad("src").link(streammux.get_request_pad("sink_0"))
    filter_src2.get_static_pad("src").link(streammux.get_request_pad("sink_1"))
    filter_src3.get_static_pad("src").link(streammux.get_request_pad("sink_2"))
    # Streammux on queue
    streammux.link(queue1)
    # queue on vid_converter
    queue1.link(vid_converter)
    #converter on queue
    vid_converter.link(queue_filesink)
    # queue to encoder
    queue_filesink.link(encoder)
    #encoder on codeparser
    encoder.link(codeparser)
    # codeparser to queue
    codeparser.link(queue_filesink2)
    #queue on container
    queue_filesink2.link(container)
    #container on queue
    container.link(queue_filesink3)
    # queue on filesink
    queue_filesink3.link(sink)
    
    # extend the lists
    INTERPIPESRC.append(interpipesrc0)
    INTERPIPESRC.append(interpipesrc1)
    INTERPIPESRC.append(interpipesrc2)

Can you try with these interpipes properties?

interpipesink sync=true async=false name=...
interpipesrc is-live=true allow-renegotiation=true stream-sync=2 listen-to=...

These are the properties we use for a similar application with DeepStream.

Additionally, could you please share the interpipesink part of the pipeline? This information would be helpful for reproducing the issue using gst-launch-1.0.

As a last resort, you can set the caps property in interpipesrc to hardcode the required caps.
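In Python, that last-resort caps pin could look like the sketch below. The helper names and the 960x544/I420 defaults are examples taken from elsewhere in this thread, not a tested configuration:

```python
def raw_caps_string(width, height, fmt="I420"):
    """Caps string matching what the source pipelines push into interpipesink."""
    return f"video/x-raw,format={fmt},width={width},height={height}"


def pin_interpipesrc_caps(interpipesrc, width=960, height=544, fmt="I420"):
    """Hardcode the caps on an interpipesrc so negotiation cannot fail."""
    # gi is imported here so raw_caps_string() stays usable without GStreamer.
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst
    interpipesrc.set_property(
        "caps", Gst.Caps.from_string(raw_caps_string(width, height, fmt)))
```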

1 Like

@miguel.taylor :
I'm using the template from your link, so the upper part of the Interpipe setup is the same as in the template.

I was able to get things running. I think the issue is somewhere around the “memory:NVMM” caps feature, since nvstreammux needs it and interpipe does not seem to accept these caps.
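If that suspicion is right, the practical consequence is that NVMM caps have to be re-established on the main-pipeline side: the interpipe boundary hands over system-memory buffers, and an nvvideoconvert placed after each interpipesrc uploads them into NVMM for nvstreammux. A sketch of that bridge, under those assumptions (element names are placeholders):

```python
def mux_pad_name(index):
    """Request-pad name on nvstreammux for the Nth stream."""
    return f"sink_{index}"


def link_interpipe_to_nvmm_mux(pipeline, interpipesrc, streammux, index):
    """interpipesrc (system memory) -> nvvideoconvert -> NVMM capsfilter -> mux pad."""
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst
    conv = Gst.ElementFactory.make("nvvideoconvert", f"nvmm_conv_{index}")
    caps = Gst.ElementFactory.make("capsfilter", f"nvmm_caps_{index}")
    # nvvideoconvert uploads the system-memory frames into NVMM here.
    caps.set_property("caps", Gst.Caps.from_string(
        "video/x-raw(memory:NVMM),format=I420"))
    for ele in (conv, caps):
        pipeline.add(ele)
    interpipesrc.link(conv)
    conv.link(caps)
    caps.get_static_pad("src").link(streammux.get_request_pad(mux_pad_name(index)))
```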

This is the source (“src”) pipeline:

def create_source_pipeline(index, url):
    src_pipeline = Gst.Pipeline.new(f"source-pipeline{index}")
    if not src_pipeline:
        logger.error("Unable to create source-pipeline")

    def on_rtspsrc_pad_added(r, pad):
        r.link(queue)

    rtsp_source = create_gst_ele("rtspsrc",f"rtspsrc{index}")
    rtsp_source.set_property("location", url)
    rtsp_source.set_property("do-rtsp-keep-alive", 1)
    rtsp_source.connect("pad-added", on_rtspsrc_pad_added)

    queue = create_gst_ele("queue", "queue")
    rtsp_decoder = create_gst_ele("rtph264depay", f"decoder{index}")
    h264parser = create_gst_ele("h264parse", "parser1")
    avdec_decoder = create_gst_ele("avdec_h264", "av_decoder")

    vid_converter = create_gst_ele("videoconvert", f"src_convertor{index}")
    video_scale = create_gst_ele("videoscale", f"src_videoscaler_{index}")

    filter1 = create_gst_ele("capsfilter", f"filter{index}")
    filter1.set_property("caps", Gst.Caps.from_string("video/x-raw, format=RGBA"))

    input_interpipesink = create_gst_ele("interpipesink")
    input_interpipesink.set_property("name", f"sink{index}")
    input_interpipesink.set_property("forward-eos", False)
    input_interpipesink.set_property("sync", True)
    input_interpipesink.set_property("drop", True)
    input_interpipesink.set_property("forward-events", True)


    logger.debug("Adding elements to Source Pipeline \n")
    src_pipeline.add(rtsp_source)
    src_pipeline.add(queue)
    src_pipeline.add(rtsp_decoder)
    src_pipeline.add(h264parser)
    src_pipeline.add(avdec_decoder)

    src_pipeline.add(vid_converter)
    src_pipeline.add(video_scale)
    src_pipeline.add(filter1)

    src_pipeline.add(input_interpipesink)

    logger.debug("Linking elements in the Source Pipeline \n")
    rtsp_source.link(queue)
    queue.link(rtsp_decoder)
    rtsp_decoder.link(h264parser)
    h264parser.link(avdec_decoder)

    avdec_decoder.link(vid_converter)
    vid_converter.link(video_scale)
    video_scale.link(filter1)
    filter1.link(input_interpipesink)

    INTERPIPESINK.append(input_interpipesink)
    SRC_PIPELINES.append(src_pipeline)
    VIDEO_SOURCE_TIMER.append(0)

    decoder_srcpad = rtsp_decoder.get_static_pad("src")
    if not decoder_srcpad:
        logger.error("Unable to create src pad\n")
    decoder_srcpad.add_probe(
        Gst.PadProbeType.BUFFER,
        test_utils.source_stream_pad_buffer_probe,
        VIDEO_SOURCE_TIMER,
        index,
        rtsp_decoder,
    )

and the working main pipeline:

def create_main_pipeline(main_pipeline):
    #creating the interpipe SRC's
    interpipesrc0 = create_gst_ele("interpipesrc", "interpipesrc_0")
    interpipesrc1 = create_gst_ele("interpipesrc", "interpipesrc_1")
    interpipesrc2 = create_gst_ele("interpipesrc", "interpipesrc_2")
    #interpipesrc camera 1
    interpipesrc0.set_property("listen-to", "sink0")
    interpipesrc0.set_property("is-live", True)
    interpipesrc0.set_property("stream-sync", 1)
    interpipesrc0.set_property("emit-signals", True)
    interpipesrc0.set_property("allow-renegotiation", True)
    interpipesrc0.set_property("do-timestamp", True)
    #interpipesrc camera 2
    interpipesrc1.set_property("listen-to", "sink1")
    interpipesrc1.set_property("is-live", True)
    interpipesrc1.set_property("stream-sync", 1)
    interpipesrc1.set_property("emit-signals", True)
    interpipesrc1.set_property("allow-renegotiation", True)
    interpipesrc1.set_property("do-timestamp", True)
    #interpipesrc camera 3
    interpipesrc2.set_property("listen-to", "sink2")
    interpipesrc2.set_property("is-live", True)
    interpipesrc2.set_property("stream-sync", 1)
    interpipesrc2.set_property("emit-signals", True)
    interpipesrc2.set_property("allow-renegotiation", True)
    interpipesrc2.set_property("do-timestamp", True)
    #testing a few options.
    vid_converter_src1 = create_gst_ele("nvvidconv", f"src_convertor_main1")
    vid_converter_src2 = create_gst_ele("nvvidconv", f"src_convertor_main2")
    vid_converter_src3 = create_gst_ele("nvvidconv", f"src_convertor_main3")
    filter_src1= create_gst_ele("capsfilter", f"filter_src1")
    filter_src1.set_property("caps", Gst.Caps.from_string("video/x-raw(memory:NVMM),format=I420"))
    filter_src2 = create_gst_ele("capsfilter", f"filter_src2")
    filter_src2.set_property("caps", Gst.Caps.from_string("video/x-raw(memory:NVMM),format=I420"))
    filter_src3 = create_gst_ele("capsfilter", f"filter_src3")
    filter_src3.set_property("caps", Gst.Caps.from_string("video/x-raw(memory:NVMM),format=I420"))
    #Creating Streammux
    streammux = create_gst_ele("nvstreammux","mux")
    if not streammux:
        sys.stderr.write(" Unable to create NvStreamMux \n")
    streammux.set_property('width', 960)
    streammux.set_property('height', 544)
    streammux.set_property('batch-size', 3)
    streammux.set_property('live-source', True)
    streammux.set_property('batched-push-timeout', 40000)
    
    # adding SRC PADS to see where is an output or not 
    interpipesrc0_srcpad = interpipesrc0.get_static_pad("src")
    if not interpipesrc0_srcpad:
        logger.error("Unable to create src pad\n")
    interpipesrc0_srcpad.add_probe(
        Gst.PadProbeType.BUFFER,
        test_function
    )
    mux_srcpad = streammux.get_static_pad("src")
    if not mux_srcpad:
        logger.error("Unable to create src pad\n")
    mux_srcpad.add_probe(
        Gst.PadProbeType.BUFFER,
        test_function
    )
    # Creating Queue after Streammux
    queue1=create_gst_ele("queue","q1")
    queue1.set_property("leaky",2)
    #Creating vid_conv
    vid_converter = create_gst_ele("nvvidconv", f"main_convertor")
    filter1 = create_gst_ele("capsfilter", f"filter")
    filter1.set_property("caps", Gst.Caps.from_string("video/x-raw(memory:NVMM),format=I420"))
    # encoder after capsfilter
    encoder = create_gst_ele("nvv4l2h264enc", "encoder")
    if not encoder:
        sys.stderr.write(" Unable to create encoder \n")



    print("Creating Code Parser \n")   
    print("Creating Container \n")
    container = create_gst_ele("qtmux", "qtmux")
    if not container:
        sys.stderr.write(" Unable to create container \n")
    # Sink
    print("Creating Sink \n")
    sink = create_gst_ele("filesink", "filesink")
    if not sink:

        sys.stderr.write(" Unable to create file sink \n")

    sink.set_property("location", "./out.mp4")
    sink.set_property("sync", 1)
    sink.set_property("async", 0)

    # ADDING
    main_pipeline.add(queue1)
    main_pipeline.add(interpipesrc0)
    main_pipeline.add(interpipesrc1)
    main_pipeline.add(interpipesrc2)
    main_pipeline.add(streammux)
    main_pipeline.add(vid_converter)
    main_pipeline.add(vid_converter_src1)
    main_pipeline.add(vid_converter_src2)
    main_pipeline.add(vid_converter_src3)
    main_pipeline.add(filter1)
    main_pipeline.add(filter_src1)
    main_pipeline.add(filter_src2)
    main_pipeline.add(filter_src3)
    main_pipeline.add(container)
    main_pipeline.add(encoder)
    main_pipeline.add(sink)

    
    
    #LINKING:
    # until Streammux : 
    interpipesrc0.link(vid_converter_src1)
    interpipesrc1.link(vid_converter_src2)
    interpipesrc2.link(vid_converter_src3)
    vid_converter_src1.link(filter_src1)
    vid_converter_src2.link(filter_src2)
    vid_converter_src3.link(filter_src3)
    filter_src1.get_static_pad("src").link(streammux.get_request_pad("sink_0"))
    filter_src2.get_static_pad("src").link(streammux.get_request_pad("sink_1"))
    filter_src3.get_static_pad("src").link(streammux.get_request_pad("sink_2"))
    streammux.link(queue1)
    queue1.link(vid_converter)
    vid_converter.link(filter1)
    filter1.link(encoder)
    encoder.link(sink)

You will see that this is a complete mess, but maybe you have a better idea, and maybe you know more about the NVMM issue.
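For the failover approach suggested earlier in the thread (switching an interpipesrc to a dummy source when a stream errors out), a hedged sketch of the bus handling is below. The "dummy-sink" name and the helpers are assumptions for this example; a matching dummy source pipeline (e.g. videotestsrc ! interpipesink name=dummy-sink) has to exist:

```python
def listen_target(index, healthy=True):
    """Name of the interpipesink an interpipesrc should listen to for stream N."""
    return f"sink{index}" if healthy else "dummy-sink"


def watch_source_bus(src_pipeline, interpipesrc, index):
    """On a bus ERROR from a source pipeline, retarget its interpipesrc."""
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    def on_message(bus, message):
        if message.type == Gst.MessageType.ERROR:
            err, debug = message.parse_error()
            print(f"stream {index} failed ({err.message}); switching to dummy source")
            interpipesrc.set_property(
                "listen-to", listen_target(index, healthy=False))

    bus = src_pipeline.get_bus()
    bus.add_signal_watch()
    bus.connect("message", on_message)
```

Switching back once the camera recovers would be the mirror image: probe the RTSP URL periodically and reset "listen-to" to the real sink name.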

There has been no update from you for a while, so we are assuming this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks

For how to use the nvurisrcbin, you can refer to deepstream_test_3.py.
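For reference, a minimal Python sketch of that approach, assuming a DeepStream 6.x install where the nvurisrcbin plugin is available; the URI and element names are placeholders, and deepstream_test_3.py shows the full wiring into nvstreammux:

```python
def source_bin_name(index):
    """Deterministic element name for the Nth source bin."""
    return f"source-bin-{index:02d}"


def make_reconnecting_source(index, uri, reconnect_interval=10):
    """Create an nvurisrcbin that keeps retrying a dropped RTSP stream."""
    import gi  # imported here so source_bin_name() stays usable without GStreamer
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst
    src = Gst.ElementFactory.make("nvurisrcbin", source_bin_name(index))
    if src is None:
        raise RuntimeError("nvurisrcbin not found; check the DeepStream install")
    src.set_property("uri", uri)
    # Seconds between reconnect attempts; 0 disables reconnection. The other
    # streams in the batch keep running while this one retries.
    src.set_property("rtsp-reconnect-interval", reconnect_interval)
    return src
```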

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.