DeepStream configuration for multiple inference engines

It appears that DeepStream supports one primary inference engine and several secondary inference engines. If I'm not mistaken, the secondary inference engines work only on the filtered output (the detected objects) of the primary inference engine.

Is there a way to perform several inference operations in parallel on the input stream, i.e. so that every inference engine works on all the input frames rather than on a filtered output?


Can you give an example to show more details of what you want?

The example that’s provided with the DeepStream release consists of:

  • a Primary inference engine to detect vehicles
  • 3 secondary inference engines that work on the detected vehicles, to identify their type, color, and make.
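In the stock deepstream-app configuration, that chaining is expressed with the `operate-on-gie-id` and `operate-on-class-ids` keys. A minimal sketch of the relevant groups (the config file names and class id are illustrative, not the exact sample paths):

```
[primary-gie]
enable=1
gie-unique-id=1
# Detector that runs on the full frames
config-file=config_infer_primary_vehicle.txt

[secondary-gie0]
enable=1
gie-unique-id=4
# Run only on objects produced by the engine with gie-unique-id=1
operate-on-gie-id=1
# Restrict to the vehicle class id emitted by the primary detector (assumed 0 here)
operate-on-class-ids=0
config-file=config_infer_secondary_carcolor.txt
```

This is why the secondaries never see the full image: each one is bound to the output of another engine via `operate-on-gie-id`.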

I’m looking to use multiple inference engines that all work on the full image. This is unlike the above scenario, where only the primary inference engine works on the full image, and the secondary engines work only on the smaller regions of the image filtered by the primary engine.

For example, I would like to use one inference engine to detect faces, another to detect vehicles, and another to detect license plates. Thus, there would be 3 primary engines. There could also be secondary engines associated with each of the primary engines.

Hope this is clear. Thanks again for responding.

Do you mean you want 3 detections (primary inference) running in parallel?

Why don’t you train a network with 3 classes (face, vehicle, LP)?

Yes, I wanted 3 primary inference engines running in parallel.

I guess I could train a single network with 3 classes, as you suggest.

However, the advantage of 3 separate detectors is plug-and-play: I could create a pipeline from 3 detectors that are already trained, each on its own type, and later add a 4th type when needed. All it would involve is modifying the DeepStream configuration file.
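At the raw GStreamer level, one way this might look is several `nvinfer` instances on parallel branches of a `tee` after the stream muxer. This is only a sketch I have not verified on hardware, and the config file names are hypothetical; deepstream-app itself accepts a single `[primary-gie]` group, so this would need gst-launch or a custom application:

```
gst-launch-1.0 \
  uridecodebin uri=file:///path/to/video.mp4 ! m.sink_0 \
  nvstreammux name=m batch-size=1 width=1920 height=1080 ! tee name=t \
  t. ! queue ! nvinfer config-file-path=face_detector.txt    unique-id=1 ! fakesink \
  t. ! queue ! nvinfer config-file-path=vehicle_detector.txt unique-id=2 ! fakesink \
  t. ! queue ! nvinfer config-file-path=lp_detector.txt      unique-id=3 ! fakesink
```

Each `nvinfer` gets a distinct `unique-id` so its metadata can be told apart downstream. The remaining problem is merging the metadata from the three branches back into one output stream, which `tee` does not do for you.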

Dear @prashanth.bhat ,

Did you find a solution for running more than one detector in parallel?
I have the same problem.

Dear @hassan.imani1987,
No, I never found a solution for this scenario.

Good luck.

Any progress?