【DeepStream 6.2】How to send the tracking result to Kafka

  • Jetson Orin NX 16G.

  • DeepStream 6.2.

  • I am running "deepstream-app -c xxxx.txt".

Hi,
I want to run the sample source4_1080p_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt and send its tracking result to AliKafka (Kafka on Alibaba Cloud; please refer to: https://help.aliyun.com/zh/apsaramq-for-kafka/getting-started-overview ).

I set up all the configuration files according to other samples, but I can't work out how to make it work through the settings in the [sink1] group. The settings I am trying are listed below.
###################
[sink1]
enable=1
type=6
source-id=0
msg-conv-config=msg_conv_config.txt
msg-conv-payload-type=1
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-6.2/lib/libnvds_kafka_proto.so
msg-broker-conn-str=alikafka-post-cn-vxxxxxxxxx02-1.alikafka.aliyuncs.com;9093
topic=ForJetson
msg-broker-config=broker_cfg.txt
###################

The msg_conv_config.txt it references is set as below.
###################
[sensor0]
enable=1
type=Camera
id=0
location=45.293701447;-75.8303914499;48.1557479338
description=Aisle Camera
coordinate=5.2;10.1;11.2

[place0]
enable=1
id=0
type=intersection/road
name=HWY_20_AND_LOCUST__EBA
location=30.32;-40.55;100.0
coordinate=1.0;2.0;3.0
place-sub-field1=C_127_158
place-sub-field2=Lane 1
place-sub-field3=P1

[analytics0]
enable=1
id=XYZ_1
description=Vehicle Detection and License Plate Recognition
source=OpenALR
version=1.0
###################

and broker_cfg.txt is set as below, following the DeepStream SDK samples.
###################
[message-broker]
consumer-group-id = TestGroup
proto-cfg = "message.max.bytes=200000;log_level=6"
producer-proto-cfg = "queue.buffering.max.messages=200000;message.send.max.retries=3"
consumer-proto-cfg = "max.poll.interval.ms=20000"
partition-key = sensor.id
share-connection = 1
###################

My questions are:

  1. Are all the above settings correct for sending the tracking result to Kafka on AliCloud? In fact, it ran without errors, but no result was sent to Kafka. I confirmed that the Kafka service is healthy, and I can send messages to it from the Jetson with the Python SDK (confluent-kafka).

  2. The broker_cfg.txt settings are from the DeepStream SDK sample. NVIDIA says it is optional. I tried deleting the line "msg-broker-config=broker_cfg.txt", and then nothing was sent to Kafka either. Should I add a username and password to broker_cfg.txt to access the Kafka on AliCloud? How should the [sink1] group be set?

  3. My purpose is to communicate with the Kafka on AliCloud. Could you please give me any hint or demo?

Thanks!
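For reference, this is roughly the kind of configuration my working confluent-kafka test uses (a sketch only; the broker address, username, password, and CA path shown are placeholders, not real values):

```python
def sasl_ssl_config(bootstrap: str, username: str, password: str, ca_location: str) -> dict:
    """Build a confluent-kafka client config for a SASL_SSL broker.

    All four arguments are placeholders here; substitute the actual
    values from the AliCloud Kafka instance console.
    """
    return {
        "bootstrap.servers": bootstrap,      # e.g. "alikafka-xxx.aliyuncs.com:9093"
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": username,
        "sasl.password": password,
        "ssl.ca.location": ca_location,      # CA certificate downloaded from AliCloud
    }

# Usage (requires the confluent-kafka package):
#   from confluent_kafka import Producer
#   p = Producer(sasl_ssl_config("host:9093", "user", "pass", "ca-cert.pem"))
#   p.produce("ForJetson", b"hello from Jetson")
#   p.flush()
```

Note that DeepStream's "msg-broker-conn-str" only carries the host and port, so there is no obvious place for the last four keys.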

If msg-conv-msg2p-new-api is not set, the default value means "(0): Create payload using NvdsEventMsgMeta", so you need to create the NvdsEventMsgMeta yourself. Please refer to generate_event_msg_meta in deepstream-test5, and to test5_dec_infer-resnet_tracker_sgie_tiled_display_int8.yml.

Sorry, msg-conv-msg2p-new-api and NvdsEventMsgMeta are all new to me. Does this mean that if I am not a C++ developer, I should give up on this method?

I just want to run the samples and send messages to Kafka without advanced development. If Kafka on AliCloud is not easy to configure, what about deploying Kafka on the Jetson itself? I noticed that others set "msg-broker-conn-str=localhost;port;topic"; I guess they have a local Kafka on the Jetson instead of a remote cloud service.

I also noticed that some people set up Kafka on Azure, yet none of them put account info in the configuration file. This confuses me.

Please try test5 with test5_dec_infer-resnet_tracker_sgie_tiled_display_int8.yml; this sample already adds NvdsEventMsgMeta for msg-conv-msg2p-new-api=0.
Alternatively, you can set msg-conv-msg2p-new-api to 1; that mode does not need any additional code. Please refer to the plugin doc.
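For that mode, the only change to the sink group from the original question would be one extra line (a sketch; the remaining keys stay exactly as posted):
###################
[sink1]
enable=1
type=6
msg-conv-msg2p-new-api=1
msg-conv-config=msg_conv_config.txt
msg-conv-payload-type=1
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream-6.2/lib/libnvds_kafka_proto.so
msg-broker-conn-str=alikafka-post-cn-vxxxxxxxxx02-1.alikafka.aliyuncs.com;9093
topic=ForJetson
msg-broker-config=broker_cfg.txt
###################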

  1. Please share the new log; there will be some log lines about the Kafka connection.
  2. test5, nvmsgconv and nvmsgbroker are open source. You can add a log line in generate_event_msg_meta to check whether that function is called.
  3. There is a tool for testing Kafka sending in /opt/nvidia/deepstream/deepstream/sources/libs/kafka_protocol_adaptor/. Please refer to its README to test.
  1. Do you mean the log printed while running deepstream-test5-app? I only have the screen output listed above and found no other log files. As for Kafka, it is on AliCloud, and the topic has no new records.

  2. From the README, it seems the tool is based on a locally installed Kafka instead of a cloud service. If I want to connect to the remote cloud Kafka service, I may need to configure cfg_kafka.txt. The README gives the template below, noting "You can add Kafka configuration and connection related details in cfg_kafka.txt". I think a username and password are necessary to access the remote Kafka service, but I don't know how to configure them. What is your idea?

Kafka cfg:

You can add Kafka configuration and connection related details in cfg_kafka.txt
Uncomment the fields you may want to edit and add proper values

example:
[message-broker]
consumer-group-id = mygrp
proto-cfg = "message.max.bytes=200000;log_level=6"
producer-proto-cfg = "queue.buffering.max.messages=200000;message.send.max.retries=3"
consumer-proto-cfg = "max.poll.interval.ms=20000"
partition-key = mykey
#share-connection = 1

  1. Check the log of "deepstream-test5-app -c test5_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt"; if the connection failed, there will be some related log lines. Please refer to this topic. Please also try "msg-conv-msg2p-new-api=1" in test5_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt.
  2. Did you modify test5_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt? If yes, please share the diff.
  3. The default connection address of the test tool is "localhost;9092". You can find it in the cpp file.
  1. Yes, I ran the configuration following your advice, and there is no log related to the Kafka connection. It is a very silent failure: the app runs normally on screen without any error (I list the full log at the bottom of this reply). I always use "msg-conv-msg2p-new-api=1".

  2. I'd like to paste the full configuration file at the bottom of this reply.

  3. As the AliCloud Kafka uses SASL_SSL authentication, no connection can be established without a username and password. Maybe I should add this info to the "msg-broker-conn-str" parameter… but I have no idea how.

There is no connection-failure log at all. In theory, if there are bboxes, nvmsgconv will pack the JSON information and nvmsgbroker will send it to the server. Here are some methods to debug:

  1. You can use network tools to check whether the packets are actually sent.
  2. Please refer to this topic for how to get the broker's low-level log.
  3. You can add a log line in new_gst_nvmsgbroker_render or legacy_gst_nvmsgbroker_render to print the content of payload->payload. The whole path is /opt/nvidia/deepstream/deepstream/sources/gst-plugins/gst-nvmsgbroker/. Please rebuild the code and replace /opt/nvidia/deepstream/deepstream/lib/gst-plugins/libnvdsgst_msgbroker.so after modifying the code.
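For the first point, a stdlib-only reachability check (a sketch; the host and port in the example comment are placeholders for your AliCloud endpoint) can confirm whether the broker port is even reachable from the Jetson before digging into DeepStream logs:

```python
import socket

def tcp_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder endpoint):
#   tcp_reachable("alikafka-xxx.aliyuncs.com", 9093)
```

Note this only checks TCP connectivity; it says nothing about whether the SASL_SSL handshake would succeed.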
  1. The network and service are OK. I tested them with a Python SDK on the Jetson, and the message was successfully sent to Kafka. In that test, I set more connection parameters than in the DeepStream settings: DeepStream only requires "hostaddress;port;topic", while the Python SDK additionally requires a username, a password, and a ca_location.

  2. Given the current info, I don't think checking more logs will help. Maybe the fastest way forward is to confirm the correct setting format for accessing the AliCloud Kafka with the SASL_SSL protocol.

  1. Please refer to this security-for-kafka guide and this topic for TLS configuration.
  2. If it still doesn't work, please enable the low-level log mentioned above.
  3. That payload->payload is the message content.
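Putting the TLS advice together with the earlier broker_cfg.txt: the Kafka adapter passes the semicolon-separated pairs in proto-cfg through to librdkafka, so the SASL_SSL credentials can be supplied there rather than in msg-broker-conn-str. A sketch (the username, password, and CA path are placeholders for the values from the AliCloud console):
###################
[message-broker]
proto-cfg = "security.protocol=SASL_SSL;sasl.mechanisms=PLAIN;sasl.username=YOUR_USER;sasl.password=YOUR_PASS;ssl.ca.location=/path/to/ca-cert.pem"
###################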

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.