It appears that DeepStream supports one primary inference engine (PGIE) and several secondary inference engines (SGIEs). If I'm not mistaken, the secondary engines only operate on the detected objects output by the primary engine, not on the full frames.
Is there a way to run several inference engines in parallel on the same input stream, so that every engine processes every input frame rather than the filtered output of another engine? If DeepStream doesn't support this directly, is there a recommended way to achieve it?
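To illustrate the topology I'm after, here is a rough sketch using a GStreamer `tee` to branch the decoded stream into two independent `nvinfer` elements. This is just a guess at how it might look, not something I've verified; the config file names are placeholders, and I don't know whether branching with `tee` after `nvstreammux` is actually the supported approach (or whether the batched GPU buffers can be shared across branches this way):

```shell
# Hypothetical pipeline: both nvinfer instances see every frame.
# model_a_config.txt / model_b_config.txt are placeholder config paths.
gst-launch-1.0 filesrc location=sample.h264 ! h264parse ! nvv4l2decoder ! \
  mux.sink_0 nvstreammux name=mux batch-size=1 width=1280 height=720 ! \
  tee name=t \
  t. ! queue ! nvinfer config-file-path=model_a_config.txt ! fakesink \
  t. ! queue ! nvinfer config-file-path=model_b_config.txt ! fakesink
```

If something like this works, I'd still need a way to merge or collect the metadata produced on each branch, so pointers on that would also be appreciated.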