How to send metadata via kafka in deepstream-transfer-learning-app

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): 3080 Ti
• DeepStream Version: 5.1 (Triton)

Hello!

I want to send metadata through Kafka by configuring a broker sink (type=6) in this app's config file, in the same way as the deepstream-test5 app.

Here is my config file (ds_transfer_learning_app_example.txt):

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=3
#uri=file://…/…/…/…/…/samples/streams/sample_1080p_h264.mp4
uri=file://…/…/…/…/…/samples/streams/sample_qHD.mp4
num-sources=1
#drop-frame-interval=2
gpu-id=0
#(0): memtype_device - Memory type Device
#(1): memtype_pinned - Memory type Host Pinned
#(2): memtype_unified - Memory type Unified
cudadec-memtype=0

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File
type=2
sync=1
source-id=0
gpu-id=0
nvbuf-memory-type=0

[sink1]
enable=0
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_kafka_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=192.168.0.6;9092;mtest0
topic=mtest0
#Optional:
#msg-broker-config=…/…/deepstream-test4/cfg_kafka.txt

[img-save]
enable=1
output-folder-path=/opt/nvidia/deepstream/deepstream-5.1/sources/apps/sample_apps/deepstream-transfer-learning-app/configs/output
save-img-cropped-obj=1
save-img-full-frame=0
frame-to-skip-rules-path=/opt/nvidia/deepstream/deepstream-5.1/sources/apps/sample_apps/deepstream-transfer-learning-app/configs/capture_time_rules.csv
second-to-skip-interval=10
min-confidence=0.5
max-confidence=1.0
min-box-width=1
min-box-height=1

When running with the above settings, no messages were sent to Kafka. I would appreciate it if you could let me know whether there is anything else I need to configure, or whether I need to modify the code.

If you want to send messages to the broker, you need to enable the type 6 sink:
[sink1]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6

"the transmission was not performed."

Please paste the log here.

Thank you for your reply.

I also enabled sink type 6, and this is the result:

Also, I'm looking for a way to send IPdata like in the picture below directly to Kafka, even without using sink type=6. Can you help?

image
image

There is no error in the output. The performance is abnormal; you need to check which component caused this.
Troubleshooting — DeepStream 6.0.1 Release documentation (nvidia.com)

Thank you for the reply.

Inside the pipeline (while the app is running), can't I send the JSON file created with "make_json_data" or "make_ipdata" to Kafka?
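
What I have in mind is roughly the following sketch (only an illustration of the idea, assuming the low-level nvds_msgapi interface exported by libnvds_kafka_proto.so can be called directly from the app; the broker address, topic, and file path are placeholders taken from my config):

/* Sketch only: push an already-generated JSON file to Kafka through the
 * DeepStream message adapter, without the type=6 sink. Built against
 * nvds_msgapi.h from sources/includes; all values below are placeholders. */
#include <dlfcn.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include "nvds_msgapi.h"

typedef NvDsMsgApiHandle (*connect_fn)(char *conn_str, nvds_msgapi_connect_cb_t cb, char *cfg);
typedef NvDsMsgApiErrorType (*send_fn)(NvDsMsgApiHandle h, char *topic, const uint8_t *payload, size_t nbuf);
typedef NvDsMsgApiErrorType (*disconnect_fn)(NvDsMsgApiHandle h);

static void connect_cb(NvDsMsgApiHandle h, NvDsMsgApiEventType evt) { (void)h; (void)evt; }

static int send_json_to_kafka(const char *json_path)
{
    void *lib = dlopen("/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_kafka_proto.so", RTLD_LAZY);
    if (!lib) { fprintf(stderr, "dlopen failed: %s\n", dlerror()); return -1; }

    connect_fn    msg_connect    = (connect_fn)    dlsym(lib, "nvds_msgapi_connect");
    send_fn       msg_send       = (send_fn)       dlsym(lib, "nvds_msgapi_send");
    disconnect_fn msg_disconnect = (disconnect_fn) dlsym(lib, "nvds_msgapi_disconnect");
    if (!msg_connect || !msg_send || !msg_disconnect) { dlclose(lib); return -1; }

    /* Read the JSON payload produced earlier (e.g. metadata.json). */
    FILE *fp = fopen(json_path, "rb");
    if (!fp) { dlclose(lib); return -1; }
    fseek(fp, 0, SEEK_END);
    long len = ftell(fp);
    rewind(fp);
    uint8_t *payload = malloc(len);
    fread(payload, 1, len, fp);
    fclose(fp);

    /* Same "host;port;topic" form as msg-broker-conn-str in the config. */
    NvDsMsgApiHandle conn = msg_connect((char *)"192.168.0.6;9092;mtest0", connect_cb, NULL);
    int ret = -1;
    if (conn) {
        if (msg_send(conn, (char *)"mtest0", payload, (size_t)len) == NVDS_MSGAPI_OK)
            ret = 0;
        msg_disconnect(conn);
    }
    free(payload);
    dlclose(lib);
    return ret;
}

(It would need to be built with the DeepStream include path for nvds_msgapi.h and linked with -ldl.)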

Isn't it convenient to have the configuration files for the broker settings?

This is the setting I originally used when communicating with Kafka in the test5 app.
I confirmed that Kafka communication works with this:

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
#source-id=0
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=1
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_kafka_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=192.168.0.6;9092;mtest1
topic=mtest1
disable-msgconv=1
msg-conv-comp-id=0
msg-broker-comp-id=0
#Optional:

[message-converter]
enable=1
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=1
#Name of library having custom implementation.
msg-conv-msg2p-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_msgconv.so
#Id of component in case only selected message to parse.
msg-conv-comp-id=0

Where is this from?

OK, so what does this mean?

A “metadata.json” file is created, and I want to send this metadata through Kafka, as in test5.

This is my JSON file.

Got your points. First, you need to add broker settings in configs/ds_transfer_learning_app_example.txt.
Second, you need to customize the app: the nvmsgconv component has its own message content, and if you do not need it, you need to remove it.
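
For reference, nvmsgconv builds its payload from NvDsEventMsgMeta user meta attached to the frames; that is the pattern deepstream-test4/test5 use. Below is a rough sketch of that pattern (illustrative only, not the transfer-learning app's actual code; the helper name attach_event_msg_meta and the field values are placeholders):

/* Illustrative sketch of the deepstream-test4/test5 pattern: attach an
 * NvDsEventMsgMeta to the frame so the type=6 sink (nvmsgconv + nvmsgbroker)
 * converts and publishes it. Field values below are placeholders. */
#include "gstnvdsmeta.h"
#include "nvdsmeta_schema.h"

static gpointer meta_copy_func(gpointer data, gpointer user_data)
{
    NvDsUserMeta *user_meta = (NvDsUserMeta *)data;
    NvDsEventMsgMeta *src = (NvDsEventMsgMeta *)user_meta->user_meta_data;
    NvDsEventMsgMeta *dst = (NvDsEventMsgMeta *)g_memdup(src, sizeof(NvDsEventMsgMeta));
    dst->sensorStr = g_strdup(src->sensorStr);
    return dst;
}

static void meta_free_func(gpointer data, gpointer user_data)
{
    NvDsUserMeta *user_meta = (NvDsUserMeta *)data;
    NvDsEventMsgMeta *meta = (NvDsEventMsgMeta *)user_meta->user_meta_data;
    g_free(meta->sensorStr);
    g_free(meta);
    user_meta->user_meta_data = NULL;
}

/* Call for each object you want published through the broker sink. */
static void attach_event_msg_meta(NvDsBatchMeta *batch_meta,
                                  NvDsFrameMeta *frame_meta,
                                  NvDsObjectMeta *obj_meta)
{
    NvDsEventMsgMeta *msg_meta = (NvDsEventMsgMeta *)g_malloc0(sizeof(NvDsEventMsgMeta));
    msg_meta->bbox.left   = obj_meta->rect_params.left;
    msg_meta->bbox.top    = obj_meta->rect_params.top;
    msg_meta->bbox.width  = obj_meta->rect_params.width;
    msg_meta->bbox.height = obj_meta->rect_params.height;
    msg_meta->frameId    = frame_meta->frame_num;
    msg_meta->sensorId   = frame_meta->source_id;
    msg_meta->confidence = obj_meta->confidence;
    msg_meta->sensorStr  = g_strdup("sensor-0");   /* placeholder sensor name */

    NvDsUserMeta *user_meta = nvds_acquire_user_meta_from_pool(batch_meta);
    if (user_meta) {
        user_meta->user_meta_data = (void *)msg_meta;
        user_meta->base_meta.meta_type = NVDS_EVENT_MSG_META;
        user_meta->base_meta.copy_func = (NvDsMetaCopyFunc)meta_copy_func;
        user_meta->base_meta.release_func = (NvDsMetaReleaseFunc)meta_free_func;
        nvds_add_user_meta_to_frame(frame_meta, user_meta);
    }
}

Once the meta is attached, the type=6 sink picks it up automatically, so no extra Kafka code is needed in the app itself.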

Thank you for the reply.

What does "need to add broker settings in configs/ds_transfer_learning_app_example.txt" mean?

Do I need any other broker settings besides the settings below?

image

I mean there is no broker sink in the original config file, configs/ds_transfer_learning_app_example.txt. You need to add it yourself:

[sink1]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-6.0/lib/libnvds_kafka_proto.so
#Provide your msg-broker-conn-str here
msg-broker-conn-str=;;
topic=
#Optional:
#msg-broker-config=…/…/deepstream-test4/cfg_kafka.txt

I see you are using version 5.1. I suggest you upgrade to the latest version; we have added new features and many bug fixes.

Thanks, I will try it.
