Adding videorate to source bin

Hello, I’m trying to add videorate to my source bin as shown below. I have some issues with this method: I get an NvBufTransform error when I delete the source from the pipeline. It looks like I need to add the videorate plugin in a different way, maybe inside the cb_newpad function.

My setup:
• Hardware Platform (Jetson / GPU) GeForce GTX 1060
• DeepStream Version 6.1
• JetPack Version (valid for Jetson only)
• TensorRT Version 8.6.1.6-1+cuda12.0
• NVIDIA GPU Driver Version (valid for GPU only) 535.183.01
• Issue Type( questions, new requirements, bugs) questions

The first method I tried:

def create_source_bin(index, uri) -> Gst.Bin:
    logging.debug(f'Creating source_bin "{index}"')

    # Create a source GstBin to abstract this bin's content from the rest of the
    # pipeline
    bin_name = f"source-bin-{index}"
    nbin = Gst.Bin.new(bin_name)
    if not nbin:
        logging.error("Unable to create source bin")

    # Source element for reading from the uri.
    # We will use decodebin and let it figure out the container format of the
    # stream and the codec and plug the appropriate demux and decode plugins.
    uri_decode_bin = Gst.ElementFactory.make("uridecodebin", "uri-decode-bin")
    if not uri_decode_bin:
        logging.error("Unable to create uri decode bin")
    # We set the input uri to the source element
    uri_decode_bin.set_property("uri", uri)
    # Connect to the "pad-added" signal of the decodebin which generates a
    # callback once a new pad for raw data has been created by the decodebin
    uri_decode_bin.connect("pad-added", cb_newpad, nbin)
    uri_decode_bin.connect("child-added", decodebin_child_added, nbin)
    # We need to create a ghost pad for the source bin which will act as a proxy
    # for the video decoder src pad. The ghost pad will not have a target right
    # now. Once the decode bin creates the video decoder and generates the
    # cb_newpad callback, we will set the ghost pad target to the video decoder
    # src pad.
    Gst.Bin.add(nbin, uri_decode_bin)

    bin_pad = nbin.add_pad(Gst.GhostPad.new_no_target("src", Gst.PadDirection.SRC))
    if not bin_pad:
        logging.error("Failed to add ghost pad in source bin")
        return None

    parent_source_bin_name = f"parent-source-bin-{index}"
    parent_source_bin = Gst.Bin.new(parent_source_bin_name)
    if not parent_source_bin:
        logging.error("Unable to create source bin")

    video_rate = make_element("videorate", index)
    video_rate.set_property("max-rate", app_config.MAX_RATE)

    Gst.Bin.add(parent_source_bin, nbin)
    Gst.Bin.add(parent_source_bin, video_rate)

    nbin.link(video_rate)

    parent_bin_pad = parent_source_bin.add_pad(Gst.GhostPad.new("src", video_rate.get_static_pad("src")))
    if not parent_bin_pad:
        logging.error("Failed to add parent ghost pad in source bin")
        return None

    return parent_source_bin

The second method:

def create_source_bin(index, uri) -> Gst.Bin:
    logging.debug(f'Creating source_bin "{index}"')

    # Create a source GstBin to abstract this bin's content from the rest of the
    # pipeline
    bin_name = f"source-bin-{index}"
    nbin = Gst.Bin.new(bin_name)
    if not nbin:
        logging.error("Unable to create source bin")

    # Source element for reading from the uri.
    uri_decode_bin = Gst.ElementFactory.make("uridecodebin", "uri-decode-bin")
    if not uri_decode_bin:
        logging.error("Unable to create uri decode bin")
    
    # Set the input uri to the source element
    uri_decode_bin.set_property("uri", uri)

    video_rate = make_element("videorate", index)
    video_rate.set_property("max-rate", app_config.MAX_RATE)

    # Connect to the "pad-added" signal of the decodebin
    uri_decode_bin.connect("pad-added", cb_newpad, nbin)
    uri_decode_bin.connect("child-added", decodebin_child_added, nbin)

    # Add elements to the bin
    Gst.Bin.add(nbin, uri_decode_bin)
    Gst.Bin.add(nbin, video_rate)

    # Link uridecodebin and videorate
    if not Gst.Element.link_many(uri_decode_bin, video_rate):
        logging.error("Elements could not be linked.")

    # Create a ghost pad for the source bin
    bin_pad = nbin.add_pad(Gst.GhostPad.new_no_target("src", Gst.PadDirection.SRC))
    if not bin_pad:
        logging.error("Failed to add ghost pad in source bin")
        return None

    return nbin

The second method also has problems: the elements do not get linked to each other (I suspect because uridecodebin only creates its source pads dynamically, so the static link_many call fails).
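
For reference, something like the sketch below is what I was imagining for the cb_newpad route. This is a simplified, hypothetical version: it assumes the same Gst/logging imports as the code above, and that the videorate element was already added to the source bin under the name "videorate-0".

def cb_newpad_with_videorate(decodebin, decoder_src_pad, source_bin):
    # Hypothetical sketch: link the dynamically created decoder pad to a
    # videorate element that was added to the source bin beforehand.
    caps = decoder_src_pad.get_current_caps()
    if not caps:
        caps = decoder_src_pad.query_caps(None)
    if not caps.get_structure(0).get_name().startswith("video"):
        return

    video_rate = source_bin.get_by_name("videorate-0")  # element name is an assumption
    if decoder_src_pad.link(video_rate.get_static_pad("sink")) != Gst.PadLinkReturn.OK:
        logging.error("Failed to link decoder src pad to videorate")
        return

    # Point the bin's ghost pad at videorate's src pad instead of the decoder pad
    ghost_pad = source_bin.get_static_pad("src")
    ghost_pad.set_target(video_rate.get_static_pad("src"))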

What is your goal in doing this?

DeepStream decodes and outputs frames to GPU (NVMM) memory, and videorate is not suitable for that.

You can copy the frames to CPU memory as shown below, but this is not recommended and usually causes performance problems:

gst-launch-1.0 uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h265.mp4 ! nvvideoconvert ! videorate ! 'video/x-raw,framerate=20/1' ! fakesink
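
If you want to try that from Python, a minimal sketch with Gst.parse_launch looks like this (the same caveat about copying to CPU memory applies):

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Same pipeline as the gst-launch-1.0 command above: decode, copy to system
# memory with nvvideoconvert, cap the frame rate with videorate, discard output.
pipeline = Gst.parse_launch(
    "uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h265.mp4 "
    "! nvvideoconvert ! videorate ! video/x-raw,framerate=20/1 ! fakesink"
)
pipeline.set_state(Gst.State.PLAYING)
bus = pipeline.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE, Gst.MessageType.EOS | Gst.MessageType.ERROR)
pipeline.set_state(Gst.State.NULL)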

DeepStream provides similar functionality: the drop-frame-interval property of nvv4l2decoder and the interval property of nvinfer.

You can view more information with gst-inspect-1.0 nvv4l2decoder or gst-inspect-1.0 nvinfer.
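
For example, in your decodebin_child_added callback you could set drop-frame-interval on the decoder, and set interval on the nvinfer element when you create it. A rough sketch (the property values and the config file path are just placeholders):

def decodebin_child_added(child_proxy, Object, name, user_data):
    if name.find("decodebin") != -1:
        Object.connect("child-added", decodebin_child_added, user_data)
    if name.find("nvv4l2decoder") != -1:
        # Output only every Nth decoded frame; see gst-inspect-1.0 nvv4l2decoder
        # for the exact semantics of this property.
        Object.set_property("drop-frame-interval", 5)

# On the inference element, interval sets how many batches to skip between
# inferences; see gst-inspect-1.0 nvinfer.
pgie = Gst.ElementFactory.make("nvinfer", "primary-inference")
pgie.set_property("config-file-path", "config_infer_primary.txt")  # placeholder path
pgie.set_property("interval", 2)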

Thanks for the answer! drop-frame-interval is not suitable for cases where I don’t know the FPS of the camera I’m adding to the pipeline. videorate enforces a fixed maximum frame rate regardless of the camera’s native frame rate.
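
To illustrate what I mean (assuming drop-frame-interval makes the decoder output only every Nth frame): the interval I would need depends on the camera’s native FPS, which I don’t know in advance, whereas videorate’s max-rate is just the target itself.

import math

def interval_for_target_fps(input_fps: float, target_fps: float) -> int:
    # Hypothetical helper: the drop-frame-interval needed to stay at or below
    # target_fps depends on the (unknown) input frame rate.
    return max(1, math.ceil(input_fps / target_fps))

print(interval_for_target_fps(30, 10))  # 3 -> a 30 fps camera needs interval 3
print(interval_for_target_fps(15, 10))  # 2 -> a 15 fps camera needs interval 2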

Can you share the reason for needing a strictly specified maximum frame rate? I think this is unnecessary.

If you want to adjust nvstreammux parameters, refer to this FAQ

If it is an RTSP camera, you need to set the live-source property of nvstreammux to true.
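
For example (a minimal sketch; the resolution and batch values are placeholders):

streammux = Gst.ElementFactory.make("nvstreammux", "stream-muxer")
streammux.set_property("live-source", 1)                # needed for live/RTSP inputs
streammux.set_property("batch-size", 1)
streammux.set_property("batched-push-timeout", 40000)   # microseconds
streammux.set_property("width", 1920)
streammux.set_property("height", 1080)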

This seems to work; however, the stream now seems laggy :(

What do you mean by laggy? Do you have any other questions?