Using secondary-gie metadata with message-broker


• Hardware Platform (Jetson / GPU): Jetson
• DeepStream Version: 5.0

I used the reference deepstream-app to construct a pipeline with a primary-gie and two secondary-gie elements.

From the pipeline graph, the src pad of secondary_gie_bin is ghosted from the src pad of secondary_gie_bin_queue, which is to say the outputs of the two secondary NvInfer elements are not connected to the rest of the pipeline (they go to fake sinks).

How can I use the reference app to take metadata from the secondary-gie elements and send it over the message broker?

Hey, it seems you need to customize the reference app to support that. In addition, may I ask why you need to run the SGIEs in parallel?

I just made this example pipeline to test the SGIE functionality. Doesn't the deepstream-test2 app have multiple SGIE elements in parallel?

@bcao Just to clarify, the DeepStream reference app does not support the use of SGIEs whatsoever. Is this correct?

For deepstream-test2, all the SGIEs are cascaded, meaning the pipeline looks like:

pgie → tracker → sgie0 → sgie1 …

Currently the DeepStream reference app can support a pipeline like deepstream-test2's, but it cannot support the parallel pipeline you described.
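For illustration, a cascaded arrangement like deepstream-test2's is expressed in the reference-app config roughly as below. This is a sketch: the config-file paths are placeholders, and the point is that each `operate-on-gie-id` refers to the PGIE's `gie-unique-id`.

```
[primary-gie]
enable=1
gie-unique-id=1
config-file=config_infer_primary.txt

[secondary-gie0]
enable=1
gie-unique-id=2
operate-on-gie-id=1
config-file=config_infer_secondary_0.txt

[secondary-gie1]
enable=1
gie-unique-id=3
operate-on-gie-id=1
config-file=config_infer_secondary_1.txt
```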

For deepstream-test2, the config files for the SGIE elements all contain the line `operate-on-gie-id=1`. Does that mean they are cascaded?

If there is only a single SGIE element, its output still goes to a fakesink. How would it be possible to access the metadata generated by the SGIE element from a downstream message broker?

That means the SGIEs all operate on the output of the same PGIE, whose gie-unique-id is 1.

It will attach the inference result to the buffer as metadata.
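For context, SGIE classifier results hang off each object in the batch metadata, nested as batch → frame_meta_list → obj_meta_list → classifier_meta_list → label_info_list. The sketch below mimics that nesting with plain Python stand-in classes (they are illustrations, not the real pyds types) purely to show the traversal order a probe would follow.

```python
# Stand-in types mirroring the nesting of DeepStream batch metadata.
# In a real pad probe these would be pyds.NvDsBatchMeta / NvDsFrameMeta /
# NvDsObjectMeta / NvDsClassifierMeta / NvDsLabelInfo, reached via .cast().
class LabelInfo:
    def __init__(self, result_label):
        self.result_label = result_label

class ClassifierMeta:
    def __init__(self, unique_component_id, label_info_list):
        self.unique_component_id = unique_component_id  # which GIE produced it
        self.label_info_list = label_info_list

class ObjectMeta:
    def __init__(self, class_id, classifier_meta_list):
        self.class_id = class_id
        self.classifier_meta_list = classifier_meta_list

class FrameMeta:
    def __init__(self, obj_meta_list):
        self.obj_meta_list = obj_meta_list

class BatchMeta:
    def __init__(self, frame_meta_list):
        self.frame_meta_list = frame_meta_list

def collect_sgie_labels(batch_meta):
    """Walk batch -> frame -> object -> classifier -> label,
    the same nesting a pyds pad probe walks with .cast() calls."""
    labels = []
    for frame in batch_meta.frame_meta_list:
        for obj in frame.obj_meta_list:
            for cls_meta in obj.classifier_meta_list:
                for info in cls_meta.label_info_list:
                    labels.append((cls_meta.unique_component_id,
                                   info.result_label))
    return labels

batch = BatchMeta([FrameMeta([ObjectMeta(
    0, [ClassifierMeta(2, [LabelInfo("sedan")])])])])
print(collect_sgie_labels(batch))  # [(2, 'sedan')]
```

In real pyds code, each level of the hierarchy is a GLib linked list walked via `.data`/`.next` with an explicit cast at each step, e.g. `pyds.NvDsFrameMeta.cast(l_frame.data)`.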

Where exactly is the info, and how do I extract it? Is the code below the right way?

    if obj_meta.class_id == PGIE_CLASS_ID_VEHICLE:
        car_info = pyds.NvDsVehicleObject.cast(obj_meta)

Hi 1733208392,

Please open a new topic with more details. Thanks.