Multistream with Streammux and NvInfer is very slow and causes a deadlock

I made a pipeline that is intended to run inference on multiple local files.

A brief description of the pipeline:

  • The pipeline is composed of 4 parts. Each part is a Gst.Bin, and their counts are as follows:

    • source-bin: can be many (this demo pipeline has 2)
    • preprocess-bin: can be many, same number as source-bins
    • inference-bin: only one
    • sink-bin: only one
  • Within each preprocess-bin, a tee element is used to output 2 streams as inputs for streammux
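The stream count matters for the batch-size questions below: since each preprocess-bin's tee fans out into 2 outputs, streammux receives `num_source_bins × 2` inputs. A trivial sketch of that arithmetic (my own helper for illustration, not part of the pipeline code):

```python
def streammux_inputs(num_source_bins: int, tee_branches: int = 2) -> int:
    """Total streams entering streammux: each source-bin feeds one
    preprocess-bin, whose tee fans out into `tee_branches` outputs."""
    return num_source_bins * tee_branches

# The demo uses 2 source-bins, so streammux sees 4 input streams;
# with a single source-bin it sees only 2.
print(streammux_inputs(2))  # -> 4
print(streammux_inputs(1))  # -> 2
```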

With the pipeline shown above, I encounter a deadlock when I add NvInfer after Streammux: the pipeline never quits. The debug info shows that streammux is waiting for buffers and GST_CLOCK is waiting for an unlock. I'm new to GStreamer and DeepStream, so I don't know where to go from here.

However, when I use only 1 source-bin and keep everything else unchanged, there is no deadlock and the pipeline runs fast.

It seems that if NvInfer's batch-size is larger than 4, the pipeline runs slowly and the deadlock appears. I'm confused.
(I use the PeopleNet model from NGC.)
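One thing worth checking (an assumption on my part, not a confirmed diagnosis): both nvstreammux's `batch-size` property and nvinfer's `batch-size` are conventionally set to the total number of input streams, which is 4 here. If streammux's `batched-push-timeout` is left at its default of -1, streammux can wait indefinitely for a batch to fill, which would be consistent with a pipeline that never quits when batch-size exceeds the stream count. A minimal fragment of an nvinfer config file showing only the relevant key (standard `[property]` group; the rest of the model config is omitted):

```
[property]
# Match the total number of streams feeding streammux (4 in this demo).
# A larger value means streammux may never collect a full batch.
batch-size=4
```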


Lastly, these are my FPS tests for the different cases. I ran my program in the Docker container nvcr.io/nvidia/deepstream:6.0.1-devel:

1. Use 2 source-bins with NvInfer (FPS slow, deadlock here)

2. Use 2 source-bins without NvInfer (FPS fast, no deadlock)

3. Use 1 source-bin with NvInfer (FPS still fast, no deadlock)


• Hardware Platform (GPU): GPU 3080
• DeepStream Version: 6.0.1
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only): 495.29.05
• Issue Type (questions, new requirements, bugs): questions

Please use deepstream-app to test multiple input sources: DeepStream Reference Application - deepstream-app — DeepStream 6.1.1 Release documentation. You only need to modify the configuration file; please refer to the README file.
Code path: /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-app
Config path: /opt/nvidia/deepstream/deepstream/samples/configs/deepstream-app
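For reference, a minimal multi-source fragment of a deepstream-app config might look like this (illustrative sketch using the standard deepstream-app config groups; the URI and the inference config filename are placeholders to be replaced with your own):

```
[source0]
enable=1
# type 3 = multiple URI sources
type=3
uri=file:///path/to/video.mp4
num-sources=2

[streammux]
# Match the number of sources
batch-size=2
# Push a (possibly partial) batch after 40 ms instead of waiting forever
batched-push-timeout=40000

[primary-gie]
enable=1
batch-size=2
config-file=config_infer_primary.txt
```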
