How to set the config file for multiple sources and Kafka brokers (sink type=6) in test5-app

Hello!
I’m studying DeepStream with a lot of help from this forum, and I’m very grateful.

I want to transmit the metadata of multiple input videos through Kafka.
For example: 3 sources / 3 sinks with type=6.

I have already successfully tested Kafka for one input through test5.
However, if there are multiple inputs, an error occurs.

This is my config file:

[sink5]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
source-id=0
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_kafka_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=192.168.20.12;9092;test9150
topic=test9150
#Optional:
msg-broker-config=../../deepstream-test4/cfg_kafka.txt

[sink6]
enable=1
source-id=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_kafka_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=192.168.20.12;9092;test9151
topic=test9151
#Optional:
msg-broker-config=../../deepstream-test4/cfg_kafka.txt

Can you upload the complete config file?

This is the config file:

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
#gie-kitti-output-dir=streamscl

[tiled-display]
enable=0
rows=1
columns=2
width=1280
height=720
gpu-id=0
nvbuf-memory-type=0

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI
type=3
uri=file://../../../../../samples/streams/sample_720p.h264
#uri=file://../../../../../samples/streams/sample_qHD.mp4
num-sources=1
gpu-id=0
source-id=0
nvbuf-memory-type=0
cudadec-memtype=0

[source1]
enable=0
#Type - 1=CameraV4L2 2=URI 3=MultiURI
type=3
uri=file://../../../../../samples/streams/sample_qHD.h264
num-sources=1
gpu-id=0
source-id=1
nvbuf-memory-type=0
cudadec-memtype=0

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File
type=2
sync=0
source-id=0
gpu-id=0
nvbuf-memory-type=0

[sink1]
enable=1
type=4
source-id=0
codec=1
#enc-type=0
sync=0
bitrate=4000000
#profile=0
rtsp-port=8554
udp-port=5000
gpu-id=0
nvbuf-memory-type=0

[sink2]
enable=0
#Type - 1=FakeSink 2=EglSink 3=File
type=2
sync=0
source-id=1
gpu-id=0
nvbuf-memory-type=0

[sink3]
enable=0
type=4
source-id=1
codec=1
#enc-type=0
sync=0
bitrate=4000000
#profile=0
rtsp-port=8555
udp-port=5001
gpu-id=0
nvbuf-memory-type=0

[sink5]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
#source-id=0
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_kafka_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=192.168.20.12;9092;test9150
topic=test9150
#Optional:
msg-broker-config=../../deepstream-test4/cfg_kafka.txt

[sink6]
enable=0
source-id=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_kafka_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=192.168.20.12;9092;test9151
topic=test9151
#Optional:
msg-broker-config=../../deepstream-test4/cfg_kafka.txt

[message-converter]
enable=0
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=0
#msg-conv-msg2p-lib=
#msg-conv-comp-id=

## Configure this group to enable cloud message consumer.

[message-consumer0]
enable=0
proto-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_kafka_proto.so
conn-str=;
config-file=
subscribe-topic-list=;;
#sensor-list-file=dstest5_msgconv_sample_config.txt

[osd]
enable=1
gpu-id=0
border-width=1
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Arial
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
gpu-id=0
##Boolean property to inform muxer that sources are live
live-source=0
batch-size=4
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=40000
width=1920
height=1080
##Enable to maintain aspect ratio wrt source, and allow black borders, works
##along with width, height properties
enable-padding=0
nvbuf-memory-type=0
#If set to TRUE, system timestamp will be attached as ntp timestamp
#If set to FALSE, ntp timestamp from rtspsrc, if available, will be attached
#attach-sys-ts-as-ntp=1

#config-file property is mandatory for any gie section.
#Other properties are optional and if set will override the properties set in
#the infer config file.
[primary-gie]
enable=1
gpu-id=0
#Required to display the PGIE labels, should be added even when using config-file
#property
batch-size=2
#Required by the app for OSD, not a plugin property
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1
interval=0
#Required by the app for SGIE, when used along with config-file property
gie-unique-id=1
nvbuf-memory-type=0
model-engine-file=../../../../../samples/models/Primary_Detector/resnet10.caffemodel_b2_gpu0_int8.engine
labelfile-path=../../../../../samples/models/Primary_Detector/labels.txt
config-file=../../../../../samples/configs/deepstream-app/config_infer_primary.txt
#infer-raw-output-dir=../../../../../samples/primary_detector_raw_output/

[tracker]
enable=1
tracker-width=600
tracker-height=288
ll-lib-file=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_mot_klt.so
#ll-config-file required for DCF/IOU only
#ll-config-file=tracker_config.yml
#ll-config-file=iou_config.txt
gpu-id=0
#enable-batch-process applicable to DCF only
enable-batch-process=0

[secondary-gie0]
enable=1
gpu-id=0
gie-unique-id=4
operate-on-gie-id=1
operate-on-class-ids=0;
batch-size=16
config-file=../../../../../samples/configs/deepstream-app/config_infer_secondary_vehicletypes.txt
labelfile-path=../../../../../samples/models/Secondary_VehicleTypes/labels.txt
model-engine-file=../../../../../samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine

[secondary-gie1]
enable=1
gpu-id=0
gie-unique-id=5
operate-on-gie-id=1
operate-on-class-ids=0;
batch-size=16
config-file=../../../../../samples/configs/deepstream-app/config_infer_secondary_carcolor.txt
labelfile-path=../../../../../samples/models/Secondary_CarColor/labels.txt
model-engine-file=../../../../../samples/models/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine

[secondary-gie2]
enable=1
gpu-id=0
gie-unique-id=6
operate-on-gie-id=1
operate-on-class-ids=0;
batch-size=16
config-file=../../../../../samples/configs/deepstream-app/config_infer_secondary_carmake.txt
labelfile-path=../../../../../samples/models/Secondary_CarMake/labels.txt
model-engine-file=../../../../../samples/models/Secondary_CarMake/resnet18.caffemodel_b16_gpu0_int8.engine

[tests]
file-loop=0

Please set “source-id” in the [sink5] group.

[sink5]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
source-id=0
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_kafka_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=192.168.20.12;9092;test9150
topic=test9150
#Optional:
msg-broker-config=../../deepstream-test4/cfg_kafka.txt

[sink6]
enable=1
source-id=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_kafka_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=192.168.20.12;9092;test9151
topic=test9151
#Optional:
msg-broker-config=../../deepstream-test4/cfg_kafka.txt

This error occurs:

(deepstream-test5-app:28921): GStreamer-CRITICAL **: 04:48:14.434: gst_bin_add: assertion ‘GST_IS_ELEMENT (element)’ failed
** ERROR: <create_pipeline:1295>: create_pipeline failed
** ERROR: main:1419: Failed to create pipeline
Quitting
App run failed

Please enable it ([sink6]) for multiple brokers.

thank you

I tried, but it doesn’t work.

This is my config file:

[sink100]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
source-id=0
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_kafka_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=192.168.20.12;9092;test9150
topic=test9150
msg-conv-comp-id=0
msg-broker-comp-id=0
disable-msgconv = 1
#Optional:
msg-broker-config=../../deepstream-test4/cfg_kafka.txt

[sink101]
enable=1
source-id=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_kafka_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=192.168.20.12;9092;test9151
topic=test9151
msg-conv-comp-id=1
msg-broker-comp-id=1
disable-msgconv = 1
#Optional:
msg-broker-config=../../deepstream-test4/cfg_kafka.txt

[message-converter0]
enable=1
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=0
#Name of library having custom implementation.
#msg-conv-msg2p-lib=
#Id of component in case only selected message to parse.
msg-conv-comp-id=0

[message-converter0]
enable=1
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=0
#Name of library having custom implementation.
#msg-conv-msg2p-lib=
#Id of component in case only selected message to parse.
msg-conv-comp-id=1

If message-converter is enabled, it means you have just one converter in your pipeline; you only need one, so please remove the duplicate message-converter0 group. Also, if you set msg-conv-comp-id and msg-broker-comp-id, you need to modify the application to fill the componentId field of the NvDsEventMsgMeta structure.

In fact, I don’t understand why each Kafka output (sink type=6) doesn’t work for multiple input sources.

For example, when there are 9 input videos, I would think that by changing the settings in the config file, the metadata of each of the 9 videos could be sent through Kafka.

I want you to tell me exactly where to edit the config file, or how to change the main code.

Thank you for always providing this service and helping

In that case, you just need one sink with type=6; you do not need a message-converter, and you do not need to modify the code for multi-source metadata sending.
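For reference, a minimal sketch of that single-broker setup (broker address, topic, and library path reused from this thread; every stream’s metadata is published to the one topic):

[sink5]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
msg-conv-config=dstest5_msgconv_sample_config.txt
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_kafka_proto.so
msg-broker-conn-str=192.168.20.12;9092;test9150
topic=test9150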

Thank you for your kind reply.

If I need only one sink with type=6, why does the error occur when I use the settings below?

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI
type=3
uri=file://../../../../../samples/streams/sample_qHD.mp4
#uri=file://../../../../../samples/streams/sample_qHD.mp4
num-sources=1
gpu-id=0
source-id=0
nvbuf-memory-type=0
cudadec-memtype=0

[source1]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI
type=3
uri=file://../../../../../samples/streams/sample_qHD.mp4
num-sources=1
gpu-id=0
source-id=1
nvbuf-memory-type=0
cudadec-memtype=0

[sink5]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
source-id=0
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_kafka_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=192.168.20.12;9092;test9150
topic=test9150
#Optional:
msg-broker-config=../../deepstream-test4/cfg_kafka.txt

[sink6]
enable=1
source-id=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_kafka_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=192.168.20.12;9092;test9151
topic=test9151
#Optional:
msg-broker-config=../../deepstream-test4/cfg_kafka.txt

(deepstream-test5-app:28921): GStreamer-CRITICAL **: 04:48:14.434: gst_bin_add: assertion ‘GST_IS_ELEMENT (element)’ failed
** ERROR: <create_pipeline:1295>: create_pipeline failed
** ERROR: main:1419: Failed to create pipeline
Quitting
App run failed

Please remove the cache:
rm ~/.cache/gstreamer-1.0/ -rf
Then run again and get the full log.

(Screenshot, 2021-10-21 14-43-19)

Is that file the full log?

No.
The full log is the standard output that the app writes to.

I still get this message:

(deepstream-test5-app:28921): GStreamer-CRITICAL **: 04:48:14.434: gst_bin_add: assertion ‘GST_IS_ELEMENT (element)’ failed
** ERROR: <create_pipeline:1295>: create_pipeline failed
** ERROR: main:1419: Failed to create pipeline

Did you remove the cache?
rm ~/.cache/gstreamer-1.0/ -rf

yes…

You can set the GST_DEBUG environment variable to get debug information and find which element failed to be added:
GST_DEBUG=5 command

I did some debugging and found that “sink_sub_bin_sink2” was not added to the pipeline, but I couldn’t confirm the reason or find any other error messages.

I am most curious about whether it is possible, in DeepStream, to send the metadata of each video to its own topic through multiple sinks.

If possible, I’d like to know the clear way. No matter how much I search the forum, I can’t find it, so I’m asking again.

And I do not know exactly what the following words mean, in the multi-broker part of the README of the test5 app.

If I need to modify the app, I want to know how to do it and how to assign the IDs appropriately.

These fields force converter / broker components to process only those messages having same value for componentId field and ignore other messages. User should modify the application to fill componentId field of NvDsEventMsgMeta structure.