If I want to modify nvmsgconv, how do I build it and how do I link it to the deepstream-test5 app?

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): Jetson
• DeepStream Version: 7.1
• JetPack Version (valid for Jetson only): 6.1
• TensorRT Version: 10.3
• NVIDIA GPU Driver Version (valid for GPU only): 540.4.0
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for a new requirement. Include the module name, i.e. which plugin or which sample application, and the function description.)

If I want to modify nvmsgconv, how do I build it and how do I link it to the deepstream-test5 app? (Regarding the third question)

Please refer to /opt/nvidia/deepstream/deepstream/sources/libs/nvmsgconv/README for how to build nvmsgconv. /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test5/configs/test5_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt includes nvmsgconv and nvmsgbroker configurations; you can use that file to test sending to the broker from test5.
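The build is roughly the following, per that README; this assumes the Makefile install target and dependency packages match your DeepStream 7.1 / JetPack 6.1 setup, so check the README for the exact package list:

```
cd /opt/nvidia/deepstream/deepstream/sources/libs/nvmsgconv
make
sudo make install   # replaces libnvds_msgconv.so under the DeepStream lib directory
```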

Building nvmsgconv following /opt/nvidia/deepstream/deepstream/sources/libs/nvmsgconv/README is done.

The config I used is test5_config_file_nvmultiurisrcbin_src_list_attr_all.txt

By default, test5_config_file_nvmultiurisrcbin_src_list_attr_all.txt disables the message broker sink. You can set enable=1 in [sink1] to enable it.

When I use test5_config_file_nvmultiurisrcbin_src_list_attr_all.txt, suppose I keep enable=0 in [sink1]. Should I then set sink0 to type=6 and enable msgbroker and msgconv there instead, is that OK?

I have corrected my last comment. You can set enable=1 in [sink1] to enable sending to Kafka through msgconv and msgbroker.
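For reference, a minimal [sink1] sketch; the key names follow the shipped test5 configs, but the broker library path, connection string and topic below are illustrative and depend on whether you use Kafka or Redis:

```
[sink1]
enable=1
# type=6 is the message broker sink
type=6
msg-conv-config=dstest5_msgconv_sample_config.txt
# 0 = full DeepStream schema, 1 = minimal schema
msg-conv-payload-type=0
# swap in libnvds_redis_proto.so and a Redis connection string if you use Redis
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream/lib/libnvds_kafka_proto.so
msg-broker-conn-str=localhost;9092
topic=dsapp
# if you use the new-API path discussed later in this thread, also check the
# msg-conv-msg2p-new-api key in your shipped config
```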


But I modified the object JSON (objectObj) to add confidence, and I don't see the new information in the message.
(I have compiled and installed the modified nvmsgconv / gst-nvmsgconv.)

For our business, we need to add some fields to the payload. The ones I know of so far are confidence and images.

These are the questions I want to ask.

I use Redis.

If you want to add some fields, please refer to osd_sink_pad_buffer_image_probe in /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test4/deepstream_test4_app.c. You can add a new user meta with type NVDS_CUSTOM_MSG_BLOB; the user meta carries an NvDsCustomMsgInfo that holds the new data. Please also refer to "3. Send the image by the broker based on Kafka" in the deepstream-test4 README. A sketch of the idea is shown below.
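A minimal sketch of that approach, assuming msg2p-newapi is enabled and that a small JSON string is enough for the custom payload. The helper name attach_custom_blob and the "confidence" field are illustrative; deepstream-test4 shows the complete version, including the copy/release callbacks and the image case:

```c
/* Illustrative fragment: attach a custom blob that nvmsgconv
 * (with msg2p-newapi=1) appends to the generated payload. */
#include <string.h>
#include <glib.h>
#include "nvdsmeta.h"
#include "nvdsmeta_schema.h"

static void
attach_custom_blob (NvDsBatchMeta *batch_meta, NvDsFrameMeta *frame_meta,
                    NvDsObjectMeta *obj_meta)
{
  /* Hypothetical payload: carry the detector confidence as JSON text. */
  gchar *blob = g_strdup_printf ("{\"confidence\":%.3f}", obj_meta->confidence);

  NvDsCustomMsgInfo *custom = (NvDsCustomMsgInfo *) g_malloc0 (sizeof (NvDsCustomMsgInfo));
  custom->message = blob;
  custom->size = strlen (blob);

  NvDsUserMeta *user_meta = nvds_acquire_user_meta_from_pool (batch_meta);
  user_meta->user_meta_data = (void *) custom;
  user_meta->base_meta.meta_type = NVDS_CUSTOM_MSG_BLOB;
  /* A real app must also set copy_func / release_func on base_meta,
   * as deepstream-test4 does, so the blob is copied and freed correctly. */
  nvds_add_user_meta_to_frame (frame_meta, user_meta);
}
```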

Thanks
Can’t I modify the underlying libraries?

To be specific,

  1. If msg2p-newapi is set to 1, you can use the method in my last comment and you do not need to modify the low-level lib. The low-level lib is open source: in generate_dsmeta_message of /opt/nvidia/deepstream/deepstream/sources/libs/nvmsgconv/deepstream_schema/dsmeta_payload.cpp, the data in NvDsCustomMsgInfo is added to the payload.
  2. If msg2p-newapi is set to 0, please read the summary posted on Nov 28 in this topic. With this method you need to modify the low-level lib, for example as sketched after this list.
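A rough sketch of option 2, assuming the per-object JSON in dsmeta_payload.cpp is built with json-glib like the rest of that file; the variable name jobject and the exact insertion point are illustrative, not the exact shipped code:

```c
/* Fragment inside the per-object payload generation in dsmeta_payload.cpp
 * (json-glib is already used by that file): add the detector confidence
 * to the object JSON so it shows up in the broker message. */
JsonObject *jobject = json_object_new ();
/* ... existing members such as id and bbox are set here ... */
json_object_set_double_member (jobject, "confidence", obj_meta->confidence);
```

After modifying the lib, rebuild and reinstall libnvds_msgconv.so as in the build step above so the running app picks up the change.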

Thanks, understood.


The question I am asking is: why is the confidence 0?

You can add logs in the app layer and in the nvmsgconv lib to check where the confidence becomes 0. If you are using nvinfer, you can use pre-cluster-threshold or other configurations to filter out objects whose confidence is 0; please refer to dstest1_pgie_config.txt (see the snippet below).
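For reference, the relevant part of an nvinfer configuration looks roughly like this; the 0.2 value is only an example threshold:

```
[class-attrs-all]
# objects whose detection confidence is below this value are dropped by nvinfer
pre-cluster-threshold=0.2
```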

I checked the configuration file and it looks fine. I added printing in the msgconv module, and the confidence it outputs is 0.
Let me take a look at the infer side first.


Do I need to provide my configuration file or something else? Could it be a configuration issue?

  1. Please add a log in the app layer to check the confidence, for example in the code right after generate_event_msg_meta; a sketch is shown after this list.
  2. Are you using your own model? If you use an EGL sink or filesink instead, can you see correct bounding boxes?
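A minimal sketch of the log in item 1, assuming it is placed in deepstream-test5's bbox_generated_probe_after_analytics right after the existing generate_event_msg_meta(...) call, and that msg_meta is the NvDsEventMsgMeta that call fills in (variable names may differ slightly in your source tree):

```c
/* Place right after the existing generate_event_msg_meta (...) call. */
g_print ("source %u object %lu: obj_meta->confidence=%f msg_meta->confidence=%f\n",
         frame_meta->source_id, (unsigned long) obj_meta->object_id,
         obj_meta->confidence, msg_meta->confidence);
```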

1. I can try it.
2. I use the default model. There is no display device; sink0 is a fakesink.
test5_config_file_nvmultiurisrcbin_src_list_attr_all.txt (8.7 KB)

I added printing to the function gie_primary_processing_done_buf_prob and found that the confidence there is not 0. This indicates that the problem is not caused by the infer module. In which directions should I investigate next?