Where will a plugin be in the DeepStream pipeline when it is enabled in the config file?

Hi guys,

I’m working with the DeepStream apps. I understand that DeepStream is built on the GStreamer pipeline model, but I don’t understand where a plugin ends up in the pipeline when it is enabled in the config file.

For example, I’m using deepstream-test5-app and I enable the dsexample plugin in the config file. What is the actual position of that plugin in the deepstream-test5 pipeline?


This is the link order between the elements: other plugins -> pgie -> dsexample -> tracker -> other plugins


I wonder whether I can, or even should, redesign the pipeline.
For example, the task I’m working on requires sending two different JSON messages to two Kafka topics: one message contains the detected-object information, the other contains the frame (as a base64 string).

I’ve already customized the former and can send it to Kafka, but I have no idea how to make the latter work. Should I convert the frame to base64 in dsexample? If so, how can I access the base64 string in the msgconv element? And how do I send two differently formatted JSON messages to two different Kafka topics?

If I enable two sink sections of type 6 (msgbroker) in the config file and set them to different msg-broker-conn-str values, will it work? Can the two msgbrokers handle differently formatted JSON messages independently?

Or is there any better or easier solution to do this task?


For your first question, about converting a frame to base64 format, you can refer to this topic,

As for how to send the base64 string to the message broker, see this section from the README of the test4 sample:

Generating custom metadata for different types of objects:
In addition to the common fields provided in the NvDsEventMsgMeta structure, the user can
also create custom objects and attach them to the buffer as NVDS_META_EVENT_MSG metadata.
To do that, NvDsEventMsgMeta provides the “extMsg” and “extMsgSize” fields. The user can
create a custom structure, fill that structure, assign the pointer of that
structure to “extMsg”, and set “extMsgSize” accordingly.
If the custom object contains fields that can’t simply be mem-copied, the user should
also provide functions to copy and free those objects.

Refer to generate_event_msg_meta() to learn how to use the “extMsg” and “extMsgSize”
fields for custom objects, how to provide the copy/free functions, and how to attach the
object to the buffer as metadata.

As for multiple broker sinks, you need DS 5.0 for this:

Multiple broker sinks

Multiple broker sinks might be required to send a message to multiple
backends simultaneously, or to send a specific message to a particular backend.

By default, a sink of type = 6 adds message-converter and message-broker
components to the pipeline.
In the case of multiple brokers, the [message-converter] group can be used to add a
single message converter to the pipeline, with multiple sinks of type = 6
having disable-msgconv set to 1.

If multiple message converters are also required along with multiple brokers,
then “msg-conv-comp-id” and “msg-broker-comp-id” should be set appropriately to
avoid duplicate messages. These fields force the converter/broker components to
process only those messages whose componentId field has the matching value and to
ignore all others. The user should modify the application to fill the componentId
field of the NvDsEventMsgMeta structure.
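As a sketch of what that looks like in a test5-style config file (broker addresses, topics, and paths below are placeholders; check the key names against your sample config), one shared converter feeds two independent broker sinks:

```ini
# Single shared message converter for the whole pipeline
[message-converter]
enable=1
msg-conv-config=dstest5_msgconv_sample_config.txt
msg-conv-payload-type=0

# First broker sink: object-detection messages (placeholder broker/topic)
[sink1]
enable=1
type=6
disable-msgconv=1
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream/lib/libnvds_kafka_proto.so
msg-broker-conn-str=broker-a;9092
topic=objects-topic

# Second broker sink: frame (base64) messages (placeholder broker/topic)
[sink2]
enable=1
type=6
disable-msgconv=1
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream/lib/libnvds_kafka_proto.so
msg-broker-conn-str=broker-b;9092
topic=frames-topic
```

With disable-msgconv=1 on both sinks, neither adds its own converter; both receive payloads from the single [message-converter] instance.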

@amycao Thanks very much for the help.
I can convert the image to base64 now, and I’ll try the Kafka part next.

By the way, can I get the source ID in the dsexample element? I can’t find any variable for the source ID in the GstDsExample struct.

You can get it from NvDsFrameMeta->source_id in dsexample.
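For context, a pseudocode-level sketch (it relies on the DeepStream SDK headers, so it is not compilable on its own) of where that field is read inside dsexample's transform function:

```c
/* Sketch only: inside gst_dsexample_transform_ip(), after retrieving
 * the batch metadata from the input buffer. */
NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (inbuf);

for (NvDsMetaList *l = batch_meta->frame_meta_list; l != NULL; l = l->next) {
  NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) l->data;
  guint source_id = frame_meta->source_id;  /* which input source this frame came from */
  /* ... e.g. tag the base64 frame payload with source_id here ... */
}
```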