Back-to-back-detectors multiple PGIEs

• Hardware Platform (GPU): RTX 3060
• DeepStream Version: 7.0
• TensorRT Version:
• NVIDIA GPU Driver Version (valid for GPU only): CUDA 12.1
• Issue Type: questions

If I use the back-to-back-detectors sample program to set up 3 PGIEs, will these 3 PGIE detectors execute in parallel at the same time, or are they processed sequentially, each one having to wait for the previous one?

They will be executed in sequence in the pipeline.
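For reference, a minimal sketch of that sequential wiring using the GStreamer Python bindings. The config file names and unique-ids below are placeholders, and the source / nvstreammux configuration is omitted; this is not the actual back-to-back-detectors sample code, just an illustration of three nvinfer elements linked one after the other:

```python
#!/usr/bin/env python3
# Sketch only: three nvinfer detectors linked back-to-back in one pipeline.
# Config file paths, element names, and unique-ids are placeholders.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.Pipeline.new("back-to-back-detectors")

streammux = Gst.ElementFactory.make("nvstreammux", "stream-muxer")
pgie1 = Gst.ElementFactory.make("nvinfer", "detector-1")
pgie2 = Gst.ElementFactory.make("nvinfer", "detector-2")
pgie3 = Gst.ElementFactory.make("nvinfer", "detector-3")
sink = Gst.ElementFactory.make("fakesink", "sink")

# Each nvinfer gets its own config file and a unique-id so the metadata
# produced by the three detectors can be told apart downstream.
pgie1.set_property("config-file-path", "detector1_config.txt")
pgie1.set_property("unique-id", 1)
pgie2.set_property("config-file-path", "detector2_config.txt")
pgie2.set_property("unique-id", 2)
pgie3.set_property("config-file-path", "detector3_config.txt")
pgie3.set_property("unique-id", 3)

for elem in (streammux, pgie1, pgie2, pgie3, sink):
    pipeline.add(elem)

# The linking order defines the processing order: every frame passes
# through detector-1, then detector-2, then detector-3.
streammux.link(pgie1)
pgie1.link(pgie2)
pgie2.link(pgie3)
pgie3.link(sink)
```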

So, the second PGIE needs to wait for the first PGIE to finish its execution before proceeding, and the third PGIE needs to wait for the second PGIE?

No. You can think of the plugins as running in parallel, but each frame passes through them sequentially. At any given moment they are all processing data at the same time, but each one is working on a different frame, like below.

...pgie1(frame_2) -> pgie2(frame_1) -> pgie3(frame_0)...

So, is it that the same image is divided into three frames (frame_2, frame_1, frame_0) for the three PGIEs to perform inference simultaneously?

No. Each image is processed sequentially in the pipeline, but each nvinfer is running in a different thread.
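In other words, this is ordinary pipelining. As a rough illustration only (plain Python threads and queues, not the DeepStream API): each stage runs in its own thread and hands each finished frame to the next stage, so at any instant the stages are busy with different frames, while every individual frame still visits the stages in order.

```python
# Illustration only: pipelined stages in separate threads, not DeepStream code.
import queue
import threading

def stage(name, inbox, outbox):
    while True:
        frame = inbox.get()
        if frame is None:          # sentinel: shut down and pass it on
            if outbox is not None:
                outbox.put(None)
            break
        print(f"{name} processing frame_{frame}")
        if outbox is not None:
            outbox.put(frame)      # hand the frame to the next detector

q01 = queue.Queue()
q12 = queue.Queue()
q23 = queue.Queue()

threads = [
    threading.Thread(target=stage, args=("pgie1", q01, q12)),
    threading.Thread(target=stage, args=("pgie2", q12, q23)),
    threading.Thread(target=stage, args=("pgie3", q23, None)),
]
for t in threads:
    t.start()

for frame in range(5):             # frames enter the first detector in order
    q01.put(frame)
q01.put(None)
for t in threads:
    t.join()
```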

So, the same frame still needs to be processed by the first PGIE before it is sent to the second PGIE for inference?
If the first PGIE has not yet completed the inference of frame_0, does that mean frame_0 will not be sent to the second PGIE for inference?

Yes.

Yes.


Thank you for your response.
