+-----------------------------------------------------------------------------+
| Processes:                                                                   |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A      2485      G   /usr/lib/xorg/Xorg                  4MiB |
+-----------------------------------------------------------------------------+

• DeepStream Version
DeepStream 6.2
I’m working with the DeepStream SDK and have set up a pipeline with primary and secondary inference. My secondary GIE is configured with operate-on-class-ids=0, meaning it should operate only on the bounding boxes of class ID 0 produced by the primary GIE.
I would like to extract these bounding box coordinates as they are received by the secondary GIE. Could someone guide me on how to access these coordinates? Specifically, which section of the code or API calls should I focus on to retrieve the bounding box coordinates in the secondary GIE?
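For context, the relevant secondary-GIE group in my config looks roughly like this (the nvinfer config-file name is only a placeholder):

[secondary-gie0]
enable=1
gpu-id=0
# unique id of this SGIE and the id of the PGIE it operates on
gie-unique-id=2
operate-on-gie-id=1
# run secondary inference only on PGIE class 0 objects
operate-on-class-ids=0
config-file=config_infer_secondary.txt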
You can just add a probe function to the source pad of the SGIE plugin.
For how to read the coordinates, you can refer to the code below.
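A minimal sketch of such a probe, assuming the standard NvDsBatchMeta / NvDsObjectMeta access pattern and that the PGIE class of interest is class 0 (the function name is just an example):

/* Buffer probe for the SGIE src pad: walks the batch metadata and prints the
 * bounding box of every object the PGIE detected as class 0. */
#include <gst/gst.h>
#include "gstnvdsmeta.h"

static GstPadProbeReturn
sgie_src_pad_buffer_probe (GstPad *pad, GstPadProbeInfo *info, gpointer u_data)
{
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);
  if (!batch_meta)
    return GST_PAD_PROBE_OK;

  for (NvDsMetaList *l_frame = batch_meta->frame_meta_list; l_frame != NULL;
       l_frame = l_frame->next) {
    NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) l_frame->data;

    for (NvDsMetaList *l_obj = frame_meta->obj_meta_list; l_obj != NULL;
         l_obj = l_obj->next) {
      NvDsObjectMeta *obj_meta = (NvDsObjectMeta *) l_obj->data;

      /* Keep only the objects the SGIE operates on
       * (PGIE class 0, per operate-on-class-ids=0) */
      if (obj_meta->class_id != 0)
        continue;

      g_print ("frame %d, class %d: left=%.1f top=%.1f width=%.1f height=%.1f\n",
          frame_meta->frame_num, obj_meta->class_id,
          obj_meta->rect_params.left, obj_meta->rect_params.top,
          obj_meta->rect_params.width, obj_meta->rect_params.height);
    }
  }
  return GST_PAD_PROBE_OK;
}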
Thank you for the suggestion to add a probe function to the source pad of the SGIE plugin for obtaining object coordinates. However, I’m not entirely clear on how to implement this. Could you please provide further details?
How do I add a probe function to the SGIE’s source pad within the DeepStream Test5 application?
Which plugin or section of the code should I refer to for this implementation?
Could you kindly indicate the file path where this probe function needs to be added?
If you use Test5, you can refer to the deepstream-app source code: /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-app.
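As a rough sketch, assuming you already have a GstElement handle to the secondary nvinfer instance (called sgie here purely for illustration, e.g. in the code that builds the secondary GIE bin used by deepstream_app.c), attaching the probe above could look like this:

/* Register the probe on the SGIE's src pad so it sees buffers after
 * secondary inference has attached its metadata. */
GstPad *sgie_src_pad = gst_element_get_static_pad (sgie, "src");
if (sgie_src_pad) {
  gst_pad_add_probe (sgie_src_pad, GST_PAD_PROBE_TYPE_BUFFER,
      sgie_src_pad_buffer_probe, NULL, NULL);
  gst_object_unref (sgie_src_pad);
}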
If you just want to dump the bounding boxes and related values, you can simply set the gie-kitti-output-dir field in the config file.
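For example, in the [application] group of the deepstream-app / Test5 config (the output directory is only an illustration), deepstream-app will then write the detected bounding boxes per frame in KITTI label format into that directory:

[application]
enable-perf-measurement=1
# Directory where per-frame bounding boxes are dumped in KITTI format
gie-kitti-output-dir=/tmp/kitti-bbox-dump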
Could you please specify the name of the function or plugin within the DeepStream SDK that is responsible for passing the bounding boxes generated by the primary detector to the secondary detector for further inference? We are currently using the DeepStream Test 5 application.
There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.