Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU) Jetson AGX Orin
• DeepStream Version 6.1
• JetPack Version (valid for Jetson only) 5.0.2
• TensorRT Version 8.4.1-1+cuda11.4
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs) question
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)
I have a program based on the deepstream-test5 application to detect vehicles on my Jetson AGX Orin. To identify each vehicle, I put an ArUco marker on it, so I am using the dsexample plugin with OpenCV (to use the ArUco functions in opencv_contrib) to get the result I want to send to my Kafka server.
But I'm encountering a problem: the ArUco marker results are not sent to Kafka. They don't even appear in test5. I confirmed that I added the new field in
/opt/nvidia/deepstream/deepstream/sources/includes/nvdsmeta.h and in
/opt/nvidia/deepstream/deepstream/sources/includes/nvdsmeta_schema.h, and that I transfer the value with
meta->arucoId = obj_params->aruco_id; in
generate_event_msg_meta(). But the value of
aruco_id is always 0, even though it works fine in dsexample.
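The pattern described above can be sketched in plain C. The structs below are simplified stand-ins (the real types are NvDsEventMsgMeta and the per-object params in the app; `ObjParams`/`EventMsgMeta` here are illustrative names, not DeepStream APIs) — the point is that the custom field must be copied into the event message meta before msgconv serializes it, and if the field were heap-allocated, the registered meta copy/free functions would have to handle it too:

```c
#include <assert.h>

/* Stand-in for the per-object params filled by dsexample.
 * (Hypothetical name; in the real app this is your own struct.) */
typedef struct {
  int aruco_id;            /* value produced by dsexample */
} ObjParams;

/* Stand-in for NvDsEventMsgMeta with the custom field added,
 * mirroring the edit made in nvdsmeta_schema.h. */
typedef struct {
  int arucoId;             /* custom field added to the event message meta */
} EventMsgMeta;

/* Mirrors the assignment done in generate_event_msg_meta(): the custom
 * value is copied from the object params into the event message meta.
 * A plain int is copied by value; a heap-allocated field would also need
 * handling in the meta copy/free callbacks. */
static void generate_event_msg_meta(EventMsgMeta *meta, const ObjParams *obj_params)
{
  meta->arucoId = obj_params->aruco_id;
}
```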
I analyzed the structure of test5 and I think the cause may be the order of elements in the pipeline. In
/opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-app/deepstream_app.c, dsexample is added after the common elements (including
msg_conv) when the pipeline is created, so it seems that
aruco_id is generated only after the Kafka message has already been sent?
I have tried modifying this by adding dsexample in
create_common_elements() so that it runs before
msg_conv, but I haven't succeeded yet.
Could you tell me the correct way to do this?
deepstream-app is open source, so you can customize it.
- You can use this method to dump the media pipeline. The ideal pipeline is … dsexample -> … -> msg_conv -> msgbroker.
- The probe function should be added after dsexample.
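For reference, the standard GStreamer way to dump the pipeline graph (which works for deepstream-app, since it is a GStreamer application) is the `GST_DEBUG_DUMP_DOT_DIR` environment variable; the directory path below is just an example:

```shell
# Ask GStreamer to write .dot snapshots of the pipeline graph.
export GST_DEBUG_DUMP_DOT_DIR=/tmp/pipeline-dots
mkdir -p "$GST_DEBUG_DUMP_DOT_DIR"

# Run the app as usual; .dot files are emitted on pipeline state changes:
#   ./deepstream-test5-app -c my_config.txt

# Convert a dump to an image (requires graphviz):
#   dot -Tpng "$GST_DEBUG_DUMP_DOT_DIR"/*PLAYING*.dot -o pipeline.png
```

Inspecting the `PLAYING` dump shows the actual element order, so you can verify where dsexample sits relative to msgconv and msgbroker.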
Thanks for your reply.
Using the method you mentioned, I dumped the graph of my pipeline and confirmed that it does include dsexample.
But I am confused, because other pipeline elements such as pgie and msg_conv are created as common elements in
create_common_elements(), while dsexample is not, so I don't know how to add it to that function correctly.
Can you give me some details about changing the position of dsexample?
Thank you for your support.
create_common_elements (which includes pgie) is only called from create_pipeline; dsexample is inserted after create_common_elements returns.
Why do you want to change the order of dsexample?
According to my understanding, if I want to send the value generated in dsexample to Kafka, then dsexample should come before msg_conv in the pipeline. But msg_conv is already included in the common elements before dsexample is added. Is this correct?
If you add [message-converter] separately in the configuration file, msg_conv is created in create_common_elements. If you add a type=6 sink and don't add [message-converter], both msg_conv and msgbroker are created in create_msg_conv_broker_bin. Please refer to section "9. Multiple broker sinks" in /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test5/README.
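For context, a type=6 sink group in the test5 config looks roughly like the sketch below. The property names follow the stock test5 sample config; the connection string, topic, and converter config path are placeholders for your own setup:

```ini
# Sketch of a type=6 (message broker) sink group.
# With this present and no [message-converter] group, both msgconv and
# msgbroker are created inside create_msg_conv_broker_bin.
[sink1]
enable=1
type=6
msg-conv-config=dstest5_msgconv_sample_config.txt
msg-conv-payload-type=0
msg-broker-proto-lib=/opt/nvidia/deepstream/deepstream/lib/libnvds_kafka_proto.so
msg-broker-conn-str=localhost;9092
topic=my-topic
```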
In my case, the sink I am using is set to type=6 and I am not using [message-converter] in my test5 config file. So it has nothing to do with msg_conv in
create_common_elements(); that was my mistake. Thanks for your advice.
Now I am trying to add the probe function for dsexample as you mentioned, but this seems a bit difficult for me because it requires some parameters.
How can I find the right parameters for dsexample? Or can you tell me how to do it correctly?
You can add the probe function in create_dsexample_bin, which creates the dsexample element. Please refer to deepstream-test1 for how to add a probe function.
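In deepstream-test1 the registration itself is a single call on a pad, roughly `gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BUFFER, callback, user_data, NULL);`, and the same call can be made on dsexample's src pad inside create_dsexample_bin. The snippet below sketches only the control flow of that pattern with minimal stand-in types (`Pad`, `Buffer`, `pad_add_probe`, `pad_push` are all simplified substitutes, not GStreamer APIs), since real GStreamer code cannot run self-contained here:

```c
#include <assert.h>
#include <stddef.h>

/* Stand-ins for GstBuffer (plus its attached metadata) and GstPad.
 * In the real app you would call gst_pad_add_probe() on the src pad of
 * the dsexample element created in create_dsexample_bin(). */
typedef struct { int aruco_id; } Buffer;
typedef int (*ProbeCallback)(Buffer *buf, void *user_data);
typedef struct {
  ProbeCallback probe;
  void *user_data;
} Pad;

/* Analogue of gst_pad_add_probe(): remember the callback on the pad. */
static void pad_add_probe(Pad *pad, ProbeCallback cb, void *user_data)
{
  pad->probe = cb;
  pad->user_data = user_data;
}

/* Analogue of a buffer flowing through the pad: the probe sees it first. */
static void pad_push(Pad *pad, Buffer *buf)
{
  if (pad->probe)
    pad->probe(buf, pad->user_data);
}

/* The probe body: read the value dsexample attached to the buffer and
 * stash it where generate_event_msg_meta() can later pick it up. */
static int dsexample_src_pad_buffer_probe(Buffer *buf, void *user_data)
{
  int *last_seen = user_data;
  *last_seen = buf->aruco_id;
  return 1;  /* corresponds to GST_PAD_PROBE_OK in the real API */
}
```

Because the probe runs as each buffer leaves dsexample, the value is available before the buffer reaches msgconv downstream.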
I have been studying the test1 app and am trying to figure it out.
Sorry for the late reply. Is this still a DeepStream issue to support? Thanks! Please open a new topic if you have other DeepStream problems.
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.