Display messages received from server through Kafka with nvdsosd

Please provide complete information as applicable to your setup.
Hi! I use DeepStream to send a feature vector to a server and get search results back through the message consumer. I would like to ask whether there is a way to add that result to the on-screen display. I have consulted deepstream-infer-tensor-meta-test for how to overlay labels/text, but I still have no idea how to take the result from the message consumer and assign it to the display meta. Please give me suggestions or any examples that solve this problem. Thanks!
• Hardware Platform (GPU)
• DeepStream Version 5.1
• JetPack Version (valid for Jetson only)
• TensorRT Version 7.1
• NVIDIA GPU Driver Version (valid for GPU only) 460
• Issue Type: questions

Do you mean you want to send the feature vector to the server through gstmsgbroker, and then the server will return some text back to the DeepStream pipeline through gstmsgbroker?

Yes. The text returned from the server is currently displayed on the console by default, but I want to show it as the SGIE's label in nvdsosd.

Can you give me any suggestions? Thank you very much @Fiona.Chen

Which text? Do you mean you want the frame text to be displayed as the object text?

In the config file I have set the following for the message consumer:
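(The original post shared the config as an image, which did not survive here. For reference, a representative message-consumer group in a deepstream-test5 style config looks roughly like the sketch below; the proto-lib path, broker address, and config file name are placeholders, not the poster's actual values:)

```
[message-consumer0]
enable=1
proto-lib=/opt/nvidia/deepstream/deepstream-5.1/lib/libnvds_kafka_proto.so
conn-str=localhost;9092
config-file=cfg_kafka.txt
subscribe-topic-list=test2
```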

This consumer is subscribed to topic 'test2' to receive a message that includes a label for the vector I send to the server. Specifically, I want the label in this received message to replace the 'Face' label shown in the image. For example, I send the vector from the output of the SGIE to the server, the server does a search and returns the label 'nhat', so the message received by the consumer includes this label 'nhat', and I want 'nhat' to be displayed instead of 'Face'. Sorry, my English is not very good! @Fiona.Chen

The object label is obj_label in the "NvDsObjectMeta" struct. NVIDIA DeepStream SDK API Reference: _NvDsObjectMeta Struct Reference
You can change it if you think the vector's result applies to the objects in that frame.

Hello, what kind of model are you working with?

Currently I am running deepstream-test5 with message-consumer0 configured as above. I obtained the payload as shown in the image. As far as I know, the message is parsed in the nvds_c2d_parse_cloud_message() function ( /opt/nvidia/deepstream/deepstream-5.1/sources/apps/apps-common/src/deepstream_c2d_msg_util.c ), and the result is returned via the msg variable. What I want to ask is how, from deepstream_test5_app_main.c, I can access this msg and assign it to the object meta. Thank you very much!

Hi, my processing flow consists of one detector model and one feature-extraction model; the feature vector is then sent to the server for processing. I would be glad for any suggestions.

Please refer to Gst-nvmsgbroker — DeepStream 6.0 Release documentation

Thank you for the reply, I will try it.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.