I want to send metadata to Kafka in deepstream-transfer-learning-app. Can anyone help me?

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): GPU, RTX 3060
• DeepStream Version: 6.1
• NVIDIA GPU Driver Version (valid for GPU only): 515

Here is my config, ds_transfer_learning_app_example.txt:

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
#gie-kitti-output-dir=streamscl

[tiled-display]
enable=1
rows=1
columns=1
width=1280
height=720
gpu-id=0
#(0): nvbuf-mem-default - Default memory allocated, specific to particular platform
#(1): nvbuf-mem-cuda-pinned - Allocate Pinned/Host cuda memory applicable for Tesla
#(2): nvbuf-mem-cuda-device - Allocate Device cuda memory applicable for Tesla
#(3): nvbuf-mem-cuda-unified - Allocate Unified cuda memory applicable for Tesla
#(4): nvbuf-mem-surface-array - Allocate Surface Array memory, applicable for Jetson
nvbuf-memory-type=0

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=3
uri=file://…/…/…/…/…/samples/streams/sample_1080p_h264.mp4
num-sources=1
#drop-frame-interval=2
gpu-id=0

#(0): memtype_device - Memory type Device
#(1): memtype_pinned - Memory type Host Pinned
#(2): memtype_unified - Memory type Unified
cudadec-memtype=0

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File
type=2
sync=1
source-id=0
gpu-id=0
nvbuf-memory-type=0

[sink1]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-6.1/lib/libnvds_kafka_proto.so
msg-broker-conn-str=localhost;9092;test1
topic=test1

[message-converter]
enable=0
msg-conv-config=dstest5_msgconv_sample_config.txt
#(0): PAYLOAD_DEEPSTREAM - Deepstream schema payload
#(1): PAYLOAD_DEEPSTREAM_MINIMAL - Deepstream schema payload minimal
#(256): PAYLOAD_RESERVED - Reserved type
#(257): PAYLOAD_CUSTOM - Custom schema payload
msg-conv-payload-type=1
#Name of library having custom implementation.
msg-conv-msg2p-lib=/opt/nvidia/deepstream/deepstream-6.1/lib/libnvds_msgconv.so
#Id of component in case only selected message to parse.
msg-conv-comp-id=0

[osd]
enable=1
gpu-id=0
border-width=1
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
gpu-id=0
##Boolean property to inform muxer that sources are live
live-source=0
batch-size=1
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=40000

## Set muxer output width and height
width=1920
height=1080
##Enable to maintain aspect ratio wrt source, and allow black borders, works
##along with width, height properties
enable-padding=0
nvbuf-memory-type=0

## If set to TRUE, system timestamp will be attached as ntp timestamp
## If set to FALSE, ntp timestamp from rtspsrc, if available, will be attached
attach-sys-ts-as-ntp=1

# config-file property is mandatory for any gie section.
# Other properties are optional and if set will override the properties set in
# the infer config file.

[primary-gie]
enable=1
gpu-id=0
model-engine-file=…/…/…/…/…/samples/models/Primary_Detector/resnet10.caffemodel_b4_gpu0_int8.engine
batch-size=4

## Set input-tensor-meta=1 to use tensor meta from the preprocess plugin
input-tensor-meta=0
#Required by the app for OSD, not a plugin property
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1
interval=0
gie-unique-id=1
nvbuf-memory-type=0
config-file=config_infer_primary_ds_transfer_learning.txt

[pre-process]
enable=0
config-file=config_preprocess.txt

[nvds-analytics]
enable=1
config-file=config_nvdsanalytics.txt

[tracker]
enable=1

# For the NvDCF and DeepSORT trackers, tracker-width and tracker-height must each be a multiple of 32

tracker-width=640
tracker-height=384
ll-lib-file=/opt/nvidia/deepstream/deepstream/lib/libnvds_nvmultiobjecttracker.so

# ll-config-file required to set different tracker types
# ll-config-file=…/…/…/…/…/samples/configs/deepstream-app/config_tracker_IOU.yml
ll-config-file=…/…/…/…/…/samples/configs/deepstream-app/config_tracker_NvDCF_perf.yml
# ll-config-file=…/…/…/…/…/samples/configs/deepstream-app/config_tracker_NvDCF_accuracy.yml
# ll-config-file=…/…/…/…/…/samples/configs/deepstream-app/config_tracker_DeepSORT.yml

gpu-id=0
enable-batch-process=1
enable-past-frame=1
display-tracking-id=1

[tests]
file-loop=0

[img-save]
enable=1
output-folder-path=./output
save-img-cropped-obj=1
save-img-full-frame=1
frame-to-skip-rules-path=capture_time_rules.csv
second-to-skip-interval=1
min-confidence=0.9
max-confidence=1.0
min-box-width=5
min-box-height=5

Is there a specific question or issue? The “deepstream-test5” example demonstrates the connection to Kafka; not sure if this is what you want.

I want to connect Kafka with the deepstream-transfer-learning-app so I can use [img-save]. It seems the deepstream-transfer-learning-app has no connection to Kafka.

You can refer to the test4 sample for how to use nvmsgbroker.

I can use nvmsgbroker in test4 and test5, but I can't use it in deepstream-transfer-learning-app.

Oh, it’s based on deepstream-app.
What error did you meet?

I set the config like this:
[sink1]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=UDPSink 5=nvoverlaysink 6=MsgConvBroker
type=6
msg-conv-config=dstest5_msgconv_sample_config2.txt
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-6.1/lib/libnvds_kafka_proto.so
#Provide your msg-broker-conn-str here
#msg-broker-conn-str=localhost;9092;quickstart-events
msg-broker-conn-str=localhost;9092;test

It sends data to Kafka in deepstream-test4 and test5, but when running deepstream-transfer-learning-app no data is sent, even though the program still runs normally.
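As an aside, `msg-broker-conn-str` is three semicolon-separated fields, `host;port;topic`, so the snippet above publishes to topic `test` on `localhost:9092`. A small POSIX shell sketch of the split (values copied from the config above):

```shell
# Split a DeepStream msg-broker-conn-str (host;port;topic) into its fields.
conn_str="localhost;9092;test"
host=${conn_str%%;*}       # everything before the first ';'
rest=${conn_str#*;}        # drop the first field
port=${rest%%;*}           # second field
topic=${rest#*;}           # third field
echo "host=$host port=$port topic=$topic"
# prints: host=localhost port=9092 topic=test
```

This is just to make the routing explicit; any consumer must read the same topic the app publishes to.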

What do you mean by “no data is sent back”? The device-to-cloud send callback, or a cloud-to-device message?

I mean when I set enable=1 on sink1 to use Kafka, it doesn’t seem to be able to connect to Kafka. Can you help me use Kafka in the deepstream-transfer-learning-app?

The config is set correctly. You can enable logging to see more details:
sudo chmod +x sources/tools/nvds_logger/setup_nvds_logger.sh
sudo ./sources/tools/nvds_logger/setup_nvds_logger.sh
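After running the setup script, the nvds logger routes messages through rsyslog; the default output path is assumed here to be `/tmp/nvds/ds.log` (check the script if yours differs). A quick way to look for broker log lines:

```shell
# Show recent nvds logger output if anything has been written yet.
# /tmp/nvds/ds.log is the path assumed from setup_nvds_logger.sh.
LOG=/tmp/nvds/ds.log
if [ -f "$LOG" ]; then
    echo "log found at $LOG"
    tail -n 20 "$LOG"    # last 20 lines of broker/app log output
else
    echo "no nvds log yet at $LOG"
fi
```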


When I check, I get a result like this:

But when I run bin/kafka-console-consumer.sh --topic quickstart-events --from-beginning --bootstrap-server localhost:9092 to view event information, I get nothing.
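One thing worth double-checking at this point: the sink1 config above publishes to topic `test`, while this consumer reads `quickstart-events`, so an empty result is expected unless the two names match. A quick sanity check in shell (values copied from the thread):

```shell
# Compare the topic in msg-broker-conn-str (host;port;topic) with the
# topic the console consumer is reading.
conn_str="localhost;9092;test"        # from the sink1 config above
consumer_topic="quickstart-events"    # from the consumer command
app_topic=${conn_str##*;}             # third field of the conn-str
if [ "$app_topic" = "$consumer_topic" ]; then
    echo "topics match: $app_topic"
else
    echo "mismatch: app publishes to '$app_topic', consumer reads '$consumer_topic'"
fi
# prints: mismatch: app publishes to 'test', consumer reads 'quickstart-events'
```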

Did you publish the message to topic quickstart-events?

How do I publish the message? Please help me.


Is this what you’re talking about?

I mean, which topic did you set in the config?

bin/kafka-console-consumer.sh --topic test --from-beginning --bootstrap-server localhost:9092

msg-broker-conn-str=localhost;9092;test
topic=test

I set it like this.

@Amycao @yingliu help me, please

It is a holiday these days; I will reply after 10/7. Sorry for the delay!


Can you consume the message by using the test5 or test4 sample?
bin/kafka-console-consumer.sh --topic test --from-beginning --bootstrap-server localhost:9092

bin/kafka-console-consumer.sh --topic test --from-beginning --bootstrap-server localhost:9092
It works very well when run with the test5 or test4 sample.