How to Drop Frames to Reduce FPS at the Beginning of DeepStream Pipeline

Hello DeepStream Community,

Details:
I am working on a project using NVIDIA DeepStream SDK where I need to process video streams with a reduced frame rate to decrease the computational load. Specifically, I want to discard frames early in the pipeline (preferably right after the source) to halve the FPS, thus reducing the number of frames processed by downstream elements.

Current Understanding:

  • I am aware that the interval property in the primary GIE (pgie) config file can skip inference on certain frames, but this does not help reduce the load on other downstream plugins as they still receive and process these frames.
  • My goal is to discard frames at the very beginning of the pipeline to ensure that subsequent plugins only receive and process the reduced frame rate.

Requirements:

  • Drop frames as early as possible in the pipeline to minimize the processing load.

Questions:

  1. Are there any recommended plugins or methods within DeepStream to achieve frame dropping efficiently and correctly?
  2. Is it possible to achieve this using an argument like interval for nvstreammux, or should it be implemented by adding a probe to uridecodebin?
  3. Is using a GstPadProbe on the source pad of uridecodebin the best approach to discard frames early in the pipeline?
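For context, the pad-probe idea in question 3 could look roughly like this. This is a sketch, not an official sample; it assumes Python GStreamer bindings as used in the DeepStream Python apps, and all names here are illustrative. The drop decision is kept in plain Python, with the Gst wiring shown in comments:

```python
# Sketch of a frame-dropping pad probe (hypothetical helper names).
# keep_every_n=2 keeps every 2nd frame, halving the effective FPS.

def make_drop_decider(keep_every_n=2):
    """Return a callable that says whether the next frame should be dropped."""
    counter = {"n": 0}

    def should_drop():
        drop = counter["n"] % keep_every_n != 0
        counter["n"] += 1
        return drop

    return should_drop

# Wiring it into a pipeline would look roughly like:
#
# def frame_drop_probe(pad, info, decider):
#     return Gst.PadProbeReturn.DROP if decider() else Gst.PadProbeReturn.OK
#
# srcpad = some_element.get_static_pad("src")
# srcpad.add_probe(Gst.PadProbeType.BUFFER, frame_drop_probe,
#                  make_drop_decider(2))
```

Because the probe returns `Gst.PadProbeReturn.DROP` before the buffer reaches downstream elements, the dropped frames never cost any downstream processing.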

Any guidance or suggestions on how to implement this effectively would be greatly appreciated.

Thank you!

What kind of source will you use in your pipeline?

Thank you for your response.

I am using RTSP streams as the sources in my DeepStream pipeline.

Additional details:

  • DeepStream Version: 6.4
  • Using the DeepStream 6.4 Docker Image
  • Running on dGPU

Thank you!

  1. If your streams carry H264 or H265 payloads, the nvv4l2decoder element (see Gst-nvvideo4linux2 — DeepStream 6.2 Release documentation) provides a “skip-frames” property for dropping frames.
  2. The GStreamer videorate plugin provides frame-rate conversion.
  3. The GStreamer valve plugin also provides frame dropping.

You may choose the method that best fits your requirements.
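As a rough illustration of options 2 and 3, the two plugins can be expressed as parse-launch fragments. This is a sketch under the assumption that the element and property names match what gst-inspect-1.0 reports for videorate and valve; the helper functions are illustrative, not DeepStream API:

```python
# Hypothetical helpers that build gst-launch-style pipeline fragments.

def videorate_fragment(max_rate):
    # drop-only=true makes videorate only drop frames, never duplicate them
    return f"videorate drop-only=true max-rate={max_rate}"

def valve_fragment(drop=False):
    # flip the 'drop' property at runtime to discard or pass all buffers
    return f"valve drop={str(drop).lower()}"

# e.g. Gst.parse_launch(
#     f"uridecodebin uri=rtsp://... ! {videorate_fragment(15)} ! fakesink")
```

Note that videorate with `drop-only=true` thins the stream without ever duplicating frames, whereas valve is an all-or-nothing gate you toggle at runtime.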

Thank you for your suggestions.

Could you please let me know if there are any sample implementations or references based on DeepStream test samples (like test3) for using either the GStreamer videorate plugin for frame rate conversion or the GStreamer valve plugin for frame dropping?

Thanks in advance for your help!

These are GStreamer plugins that are widely used in many GStreamer applications. DeepStream has no special limitations with them; you can find usage examples and samples online.

Thank you for your response.

I went through the videorate plugin, and since reading the RTSP streams in my application is exactly the same as in the DeepStream deepstream-imagedata-multistream test application, I added this plugin to each bin in the following way and then attached the output of each videorate to nvstreammux:

for i, uri_name in enumerate(self.rtsp_urls):
    print(f"URI Name: {uri_name}")
    source_bin = SourceBinCreator.create_source_bin(i, uri_name)
    if not source_bin:
        sys.stderr.write("Unable to create source bin\n")
        return

    self.pipeline.add(source_bin)
    
    # Create videorate element
    videorate = Gst.ElementFactory.make("videorate", f"videorate-{i}")
    if not videorate:
        sys.stderr.write("Unable to create videorate\n")
        return
    # Set the max-rate property of videorate
    videorate.set_property("max-rate", max_frame_rate)
    # Add videorate to the pipeline
    self.pipeline.add(videorate)
    # Link source_bin to videorate
    source_bin_pad = source_bin.get_static_pad("src")
    videorate_pad = videorate.get_static_pad("sink")
    source_bin_pad.link(videorate_pad)
    # Link videorate to streammux
    sinkpad = self.streammux.get_request_pad(f"sink_{i}")
    if not sinkpad:
        sys.stderr.write(f"Unable to get the sink pad of streammux\n")
        return
    srcpad = videorate.get_static_pad("src")
    srcpad.link(sinkpad)

The application and the videorate plugin work properly with some RTSP streams. However, I have a strange problem: the application no longer works with a few RTSP streams that it handled without any issues before I added the videorate plugin, and it stops at this stage:

Decodebin child added: source
Decodebin child added: source
Decodebin child added: source
Decodebin child added: source
Decodebin child added: source
Decodebin child added: decodebin0
Decodebin child added: decodebin1
Decodebin child added: rtppcmadepay0
Decodebin child added: rtpmp4gdepay0
Decodebin child added: alawdec0
Decodebin child added: aacparse0
In cb_newpad
Decodebin child added: decodebin2
Error: Decodebin did not pick nvidia decoder plugin.
Decodebin child added: decodebin3
Decodebin child added: rtpmp4gdepay1
Decodebin child added: rtpmp4gdepay2
Decodebin child added: avdec_aac0
Decodebin child added: aacparse1
Decodebin child added: aacparse2
Decodebin child added: avdec_aac1
Decodebin child added: avdec_aac2
In cb_newpad
Error: Decodebin did not pick nvidia decoder plugin.
In cb_newpad
Error: Decodebin did not pick nvidia decoder plugin.
In cb_newpad
Error: Decodebin did not pick nvidia decoder plugin.
Decodebin child added: decodebin4
Decodebin child added: rtph265depay0
Decodebin child added: h265parse0
Decodebin child added: capsfilter0
Decodebin child added: nvv4l2decoder0
Warning: Color primaries 5 not present and will be treated BT.601
In cb_newpad
Warning: gst-stream-error-quark: not negotiated (11): ../libs/gst/base/gstbasetransform.c(1431): gst_base_transform_reconfigure_unlocked (): /GstPipeline:pipeline0/GstBin:source-bin-00/GstURIDecodeBin:uri-decode-bin/GstDecodeBin:decodebin4/GstCapsFilter:capsfilter0:
not negotiated
Decodebin child added: decodebin5
Decodebin child added: rtph265depay1
Decodebin child added: h265parse1
Decodebin child added: capsfilter1
Decodebin child added: nvv4l2decoder1
Decodebin child added: decodebin6
Decodebin child added: rtph265depay2
Decodebin child added: h265parse2
Decodebin child added: capsfilter2
Decodebin child added: nvv4l2decoder2
Decodebin child added: decodebin7
Decodebin child added: rtph265depay3
Decodebin child added: h265parse3
Decodebin child added: capsfilter3
Decodebin child added: nvv4l2decoder3
Decodebin child added: decodebin8
Decodebin child added: decodebin9
Decodebin child added: rtppcmadepay1
Decodebin child added: rtph264depay0
Decodebin child added: alawdec1
In cb_newpad
Error: Decodebin did not pick nvidia decoder plugin.
Decodebin child added: h264parse0
Decodebin child added: capsfilter4
Error: gst-stream-error-quark: Internal data stream error. (1): ../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstBin:source-bin-00/GstURIDecodeBin:uri-decode-bin/GstRTSPSrc:source/GstUDPSrc:udpsrc3:
streaming stopped, reason not-negotiated (-4)
Decodebin child added: nvv4l2decoder4
[NvMultiObjectTracker] De-initialized
Warning: Color primaries 5 not present and will be treated BT.601
In cb_newpad
In cb_newpad
In cb_newpad

In some runs of the application after adding the videorate plugin, I also hit this error:

Error: gst-stream-error-quark: NvStreamMux does not suppport raw buffers. Use nvvideoconvert before NvStreamMux to convert to NVMM buffers (5): gstnvstreammux.cpp(1278): gst_nvstreammux_sink_event (): /GstPipeline:pipeline2/GstNvStreamMux:Stream-muxer
[NvMultiObjectTracker] De-initialized

I was wondering if I need to use another plugin after each videorate and before nvstreammux, or set a property of videorate such as nvbuf-memory-type.

Can you please help me resolve this issue?

The log shows that the hardware video decoder is not enabled. Please check the video format of those few RTSP streams.

Thank you for your prompt response.

I would like to clarify that when I remove the videorate plugin, my application has no problem with these RTSP streams. When I add the videorate plugin to the pipeline, it works well with some other RTSP streams but raises the error above with these specific streams.

I actually have several RTSP streams. With some of them, the videorate plugin raises the above error; with the others, it works correctly. When I remove the videorate plugin, my application has no problem with any of the RTSP streams!

Considering that my application is based on the deepstream-imagedata-multistream DeepStream Python test application on DeepStream 6.4, I am concerned that the way I am using the videorate plugin in the code above might be incorrect. I wonder if I need to set any other properties, for example related to CUDA unified memory.

Is it okay to reuse the same functions and code structures from the deepstream-imagedata-multistream test app, including create_source_bin, decodebin_child_added, and cb_newpad, while using the videorate plugin? Or should I change something and use a more robust structure for reading RTSP streams together with videorate?

Your guidance on this matter would be greatly appreciated.

Thank you for your help.

You need to set the proper parameters on nvstreammux. See: DeepStream SDK FAQ - Intelligent Video Analytics / DeepStream SDK - NVIDIA Developer Forums

Thank you for your suggestion.

I referred to the link you mentioned and followed the guidelines for Gst-nvstreammux, since I am not using the new Gst-nvstreammux. I have met all three of the mentioned requirements in my application:
1- Set the batch-size to the number of sources.
2- Exported the environment variable:

export NVSTREAMMUX_ADAPTIVE_BATCHING=yes

3- Set the batched-push-timeout to:

1000000 us / maximum fps among the videos
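The formula above works out to a simple integer division; as a quick sanity check (the helper name here is hypothetical):

```python
# batched-push-timeout = 1,000,000 us divided by the highest source FPS.

def batched_push_timeout_us(max_fps):
    return 1_000_000 // max_fps

# e.g. streammux.set_property("batched-push-timeout",
#                             batched_push_timeout_us(30))  # 33333 us at 30 FPS
```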

However, the issue still persists. When I add the videorate plugin to the pipeline, it works well with some RTSP streams but raises the mentioned error with these specific streams.
I revisited the mentioned error, which was:

Error: gst-stream-error-quark: NvStreamMux does not support raw buffers. Use nvvideoconvert before NvStreamMux to convert to NVMM buffers (5): gstnvstreammux.cpp(1278): gst_nvstreammux_sink_event (): /GstPipeline:pipeline2/GstNvStreamMux:Stream-muxer
[NvMultiObjectTracker] De-initialized

and to address this, I added nvvideoconvert and a capsfilter between each source_bin and nvstreammux to meet the NVMM-buffer requirement mentioned in the error above. Here is the code for receiving the RTSP streams that causes the error:


for i, uri_name in enumerate(rtsp_urls):
    print(f"URI Name: {uri_name}")
    
    # Create a new Gst.Bin
    bin_name = "source-bin-%02d" % i
    print(bin_name)
    nbin = Gst.Bin.new(bin_name)
    if not nbin:
        sys.stderr.write("Unable to create source bin\n")
        return

    # Create uridecodebin element
    uri_decode_bin = Gst.ElementFactory.make("uridecodebin", "uri-decode-bin")
    if not uri_decode_bin:
        sys.stderr.write("Unable to create uri decode bin\n")
        return
    uri_decode_bin.set_property("uri", uri_name)
    uri_decode_bin.connect("pad-added", cb_newpad, nbin)
    uri_decode_bin.connect("child-added", decodebin_child_added, nbin)

    Gst.Bin.add(nbin, uri_decode_bin)
    bin_pad = nbin.add_pad(Gst.GhostPad.new_no_target("src", Gst.PadDirection.SRC))
    if not bin_pad:
        sys.stderr.write("Failed to add ghost pad in source bin\n")
        return None
    
    self.pipeline.add(nbin)
    
    # Create videorate element
    videorate = Gst.ElementFactory.make("videorate", f"videorate-{i}")
    if not videorate:
        sys.stderr.write("Unable to create videorate\n")
        return
    videorate.set_property("max-rate", self.max_frame_rate)
    self.pipeline.add(videorate)
    
    # Create nvvidconv element
    nvvidconv = Gst.ElementFactory.make("nvvideoconvert", f"nvvidconv-{i}")
    if not nvvidconv:
        sys.stderr.write("Unable to create nvvideoconvert\n")
        return
    nvvidconv.set_property("nvbuf-memory-type", self.mem_type)
    self.pipeline.add(nvvidconv)
    
    # Create filter element
    caps = Gst.Caps.from_string("video/x-raw(memory:NVMM), format=RGBA")
    filter = Gst.ElementFactory.make("capsfilter", f"filter-{i}")
    if not filter:
        sys.stderr.write("Unable to create capsfilter\n")
        return
    filter.set_property("caps", caps)
    self.pipeline.add(filter)

    # Link source_bin to videorate
    source_bin_pad = nbin.get_static_pad("src")
    videorate_pad = videorate.get_static_pad("sink")
    source_bin_pad.link(videorate_pad)
    
    # Link videorate to nvvidconv
    videorate_src_pad = videorate.get_static_pad("src")
    nvvidconv_pad = nvvidconv.get_static_pad("sink")
    videorate_src_pad.link(nvvidconv_pad)
    
    # Link nvvidconv to filter
    nvvidconv_src_pad = nvvidconv.get_static_pad("src")
    filter_pad = filter.get_static_pad("sink")
    nvvidconv_src_pad.link(filter_pad)

    # Link filter to streammux
    sinkpad = self.streammux.get_request_pad(f"sink_{i}")
    if not sinkpad:
        sys.stderr.write(f"Unable to get the sink pad of streammux\n")
        return
    filter_src_pad = filter.get_static_pad("src")
    filter_src_pad.link(sinkpad)
    

The structure for each source bin:

nbin --> videorate --> nvvidconv --> filter --> streammux

But now, the following error occurs:

Warning: gst-stream-error-quark: not negotiated (11): ../libs/gst/base/gstbasetransform.c(1431): gst_base_transform_reconfigure_unlocked (): /GstPipeline:pipeline0/GstBin:source-bin-02/GstURIDecodeBin:uri-decode-bin/GstDecodeBin:decodebin0/GstCapsFilter:capsfilter0:
not negotiated
Error: gst-stream-error-quark: Internal data stream error. (1): ../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstBin:source-bin-02/GstURIDecodeBin:uri-decode-bin/GstRTSPSrc:source/GstUDPSrc:udpsrc14:
streaming stopped, reason not-negotiated (-4)
[NvMultiObjectTracker] De-initialized

The problem seems to be that elements in the GStreamer pipeline cannot agree on a common format after videorate is added for FPS tuning of the sources.
Again, I want to stress that when I remove the videorate plugin, my application has no problem with any of the RTSP streams!

Could you kindly help me resolve this issue?

Do you mean the error happens with some specific RTSP streams while the other RTSP streams work well with “videorate”?

Exactly!

As I mentioned, some specific RTSP streams work properly with this structure:

nbin --> videorate --> streammux

and some others raise the error I mentioned.

When I remove the videorate plugin, my application has no problem with any RTSP streams!

Can you show us the differences between the error RTSP streams and the correct RTSP streams? What are the video formats in the error RTSP streams?

Certainly, here are the details of the RTSP streams:

Correct RTSP Streams

The application works properly with these RTSP streams. Below are the codec information screenshots for these streams:
[screenshot: correct_30]
[screenshot: correct_25]
[screenshot: correct_22]

Error RTSP Streams

The application encounters issues with these RTSP streams. Below are the codec information screenshots for these streams:
[screenshot: error_23]
[screenshot: error_21]

Please note that these screenshots are taken from the VLC app’s ‘Codec Information’ section.

Pipeline Configuration

The pipeline I use is structured as follows:

nbin --> videorate --> streammux

Error Occurs With Problematic RTSPs

The error that occurs with the problematic RTSPs is:

In cb_newpad
Warning: gst-stream-error-quark: not negotiated (11): ../libs/gst/base/gstbasetransform.c(1431): gst_base_transform_reconfigure_unlocked (): /GstPipeline:pipeline0/GstBin:source-bin-00/GstURIDecodeBin:uri-decode-bin/GstDecodeBin:decodebin5/GstCapsFilter:capsfilter2:
not negotiated
Error: gst-stream-error-quark: NvStreamMux does not suppport raw buffers. Use nvvideoconvert before NvStreamMux to convert to NVMM buffers (5): gstnvstreammux.cpp(1278): gst_nvstreammux_sink_event (): /GstPipeline:pipeline0/GstNvStreamMux:Stream-muxer
[NvMultiObjectTracker] De-initialized

Could you please help us identify why these particular streams are causing issues?

Can the error RTSP streams work with the following pipeline?
GST_DEBUG=capsfilter:5,v4l2videodec:5 gst-launch-1.0 uridecodebin uri=rtsp://xxxxxx ! nvvideoconvert ! videorate ! 'video/x-raw(memory:NVMM),framerate=28/1' ! fakesink

Yes, the error RTSP streams work with the provided pipeline.

However, I am still experiencing an error with my DeepStream app. Could you help identify the problem and suggest how I should change my pipeline?

My current pipeline:

nbin --> videorate --> streammux --> queue1 --> nvvidconv --> queue2 --> filter .... --> nvosd --> queue10 --> sink

Thank you for your assistance!

Please use the pipeline I provided before nvstreammux.
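In other words, per source the working order is nvvideoconvert before videorate, with NVMM memory and the target framerate pinned in the caps, mirroring the debug pipeline given earlier. A minimal sketch (the helper name is hypothetical; the caps string follows that debug pipeline):

```python
# Build the caps string that pins NVMM memory and the target framerate,
# so videorate and nvstreammux can negotiate a common format.

def nvmm_rate_caps(fps):
    return f"video/x-raw(memory:NVMM),framerate={fps}/1"

# Per source, roughly:
# nbin ! nvvideoconvert ! videorate ! capsfilter caps=nvmm_rate_caps(28) ! streammux
```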

Thanks for your help. It resolved my issue.