• **Hardware Platform (GPU)**: RTX 3060
• **DeepStream Version**: 7.0
• **TensorRT Version**:
• **NVIDIA GPU Driver Version (valid for GPU only)**: CUDA 12.1
• **Issue Type**: questions
If I use the back-to-back-detectors sample program to set up 3 PGIEs, will these 3 PGIE detectors execute in parallel simultaneously, or are they processed sequentially, with each one running only after the previous one finishes?
No. You can think of the plugins as running in parallel, but each frame flows through them sequentially. All the PGIEs can be busy at the same moment, but each one is working on a different frame, as illustrated below.
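A minimal sketch of this pipelined behavior, assuming three chained detector stages connected by queues (this is not DeepStream API code; the stage names, queue wiring, and sleep used as a stand-in for inference latency are illustrative assumptions):

```python
import threading
import queue
import time

NUM_FRAMES = 4
SENTINEL = None  # marks end of stream


def make_stage(name, inbox, outbox):
    """Each stage runs in its own thread, like a plugin in the pipeline."""
    def run():
        while True:
            frame = inbox.get()
            if frame is SENTINEL:
                if outbox is not None:
                    outbox.put(SENTINEL)
                break
            print(f"{name} starts frame_{frame}")
            time.sleep(0.1)  # stand-in for inference latency
            print(f"{name} done   frame_{frame}")
            if outbox is not None:
                outbox.put(frame)  # a frame moves on only after this stage finishes it
    return threading.Thread(target=run)


q01, q12, q23 = queue.Queue(), queue.Queue(), queue.Queue()
stages = [
    make_stage("PGIE1", q01, q12),
    make_stage("PGIE2", q12, q23),
    make_stage("PGIE3", q23, None),
]
for s in stages:
    s.start()

for f in range(NUM_FRAMES):
    q01.put(f)  # the source pushes frames in order
q01.put(SENTINEL)

for s in stages:
    s.join()
```

Running this, you will see PGIE1 start frame_1 while PGIE2 is still working on frame_0: the stages overlap in time, but frame_0 only reaches PGIE2 after PGIE1 has finished it, which matches the behavior described above.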
So, the same frame still needs to be processed by the first PGIE before it is sent to the second PGIE for inference?
If the first PGIE has not yet completed the inference of frame_0, does that mean frame_0 will not be sent to the second PGIE for inference?