DeepStream pipeline

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs. Include which sample app is being used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for a new requirement. Include the module name, i.e. for which plugin or which sample application, and the function description.)

Hello,

I have been trying the example Python apps and see a difference in how the pipelines are created that I cannot understand. Some of the apps insert additional “queue” elements as bridges between the main elements:
streammux → queue1 → pgie → queue2 → tracker → …
while others link the main elements directly:
streammux → pgie → nvvidconv → nvosd → …
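
In code, the two styles look roughly like this (a rough sketch of what I mean; the element properties, source bin and sink are omitted):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.Pipeline.new("pipeline")

# Element factory names as used in the Python samples; properties omitted here.
streammux = Gst.ElementFactory.make("nvstreammux", "streammux")
pgie = Gst.ElementFactory.make("nvinfer", "pgie")
tracker = Gst.ElementFactory.make("nvtracker", "tracker")
queue1 = Gst.ElementFactory.make("queue", "queue1")
queue2 = Gst.ElementFactory.make("queue", "queue2")
for e in (streammux, queue1, pgie, queue2, tracker):
    pipeline.add(e)

# Style 1: queue elements between the main elements
streammux.link(queue1)
queue1.link(pgie)
pgie.link(queue2)
queue2.link(tracker)

# Style 2 (other apps): the main elements linked directly
# streammux.link(pgie)
# pgie.link(nvvidconv)
# nvvidconv.link(nvosd)
```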

What is the difference, and when should each approach be used when creating a pipeline?

Best regards,
Mladen

Please refer to queue (gstreamer.freedesktop.org)

DeepStream is based on GStreamer. Please make sure you are familiar with GStreamer before you start with DeepStream.

OK, I got familiar with it, but I still cannot understand the difference. Could you please explain it in terms of its usage in DeepStream and the provided examples?

Please refer to queue (gstreamer.freedesktop.org)

Is the goal of this developer’s forum to be at all helpful to users? What is the point in repeating your reply? I read it the first time, but I didn’t get the difference. Did you understand my second reply, or do you need to improve your English first?

Do you get paid for this? Nvidia could totally replace you with a chatbot…

queue is just an ordinary, public open-source GStreamer plugin. Using it makes the upstream element’s src pad and the downstream element’s sink pad run in different threads, so that parts of the pipeline can work asynchronously. The queue documentation explains this.
It is a common component that is not provided by NVIDIA. Whether or not to use queue in your pipeline is up to you. You can also discuss it with the GStreamer community.
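
For illustration only, here is a minimal sketch of that behaviour using plain GStreamer elements (no DeepStream plugins are needed to see the effect): everything downstream of the queue runs in a separate streaming thread started by the queue.

```python
#!/usr/bin/env python3
# Minimal sketch: queue as a thread boundary (plain GStreamer, no DeepStream).
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Without "queue", videotestsrc, videoconvert and fakesink all run in the
# same streaming thread. With it, the part after the queue runs in its own
# thread, fed from the queue's internal buffer.
pipeline = Gst.parse_launch(
    "videotestsrc num-buffers=100 ! queue ! videoconvert ! fakesink"
)

pipeline.set_state(Gst.State.PLAYING)
bus = pipeline.get_bus()
bus.timed_pop_filtered(
    Gst.CLOCK_TIME_NONE, Gst.MessageType.EOS | Gst.MessageType.ERROR
)
pipeline.set_state(Gst.State.NULL)
```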

Generally,
Queues are used as buffers for inter-thread data communication.
There are two types of inter-thread data communication: thread-safe and non-thread-safe. (GStreamer’s is thread-safe.)
Consider using a queue when downstream processing is expected to be slower than upstream processing.
However, since queues have capacity limits, you should consider letting the queue drop buffers to prevent it from overflowing.
The property that controls dropping is in the reference shown by @Fiona.Chen
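
As a sketch of what that looks like in the Python apps (the property values here are arbitrary and only for illustration; see the queue documentation for the exact semantics):

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
queue1 = Gst.ElementFactory.make("queue", "queue1")

# Capacity limits; a value of 0 disables that particular limit.
queue1.set_property("max-size-buffers", 4)
queue1.set_property("max-size-bytes", 0)
queue1.set_property("max-size-time", 0)

# Drop behaviour when the queue is full: "leaky" = 2 ("downstream") drops
# the oldest buffered data instead of blocking the upstream thread.
queue1.set_property("leaky", 2)
```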

Here is a better explanation.

Threads (gstreamer.freedesktop.org)

Above, we’ve mentioned the “queue” element several times now. A queue is the thread boundary element through which you can force the use of threads. It does so by using a classic provider/consumer model as learned in threading classes at universities all around the world. By doing this, it acts both as a means to make data throughput between threads threadsafe, and it can also act as a buffer.

