• Hardware Platform (Jetson / GPU) GPU
• DeepStream Version 6.0
• JetPack Version (valid for Jetson only)
• TensorRT Version 8.0.1
• NVIDIA GPU Driver Version (valid for GPU only) 495.44
• Issue Type (questions, new requirements, bugs) Question
Hi, I followed this link and am implementing the PGIE+SGIE pipeline in deepstream-app.
I was wondering how I can replicate the “sgie_pad_buffer_probe” function in deepstream-app.
In deepstream-app, setting “process-mode=2” in the config file makes the nvinfer element act as a secondary detector/classifier. I want to take the output of the secondary inference and parse it (e.g. landmarks).
The link demonstrates:
- pgie_pad_buffer_probe → runs the primary inference and generates the face boxes
- sgie_pad_buffer_probe → runs the secondary inference and attaches the landmarks as metadata
But the deepstream-app setup is a bit different: the secondary bin is created automatically when [secondary-gie0] is enabled in the config file. So where should I write the code to get the secondary detector's output?
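For context, the config group I am enabling is roughly the sketch below (key names follow the deepstream-app config-file conventions; the config-file path is a placeholder, and process-mode=2 itself goes in the referenced nvinfer config file, not in this group):

```ini
# Sketch of the secondary GIE group in the deepstream-app config.
# The config-file path is a placeholder for my actual nvinfer config,
# which contains process-mode=2.
[secondary-gie0]
enable=1
gpu-id=0
gie-unique-id=2
# Run on the objects detected by the primary GIE (gie-unique-id=1)
operate-on-gie-id=1
config-file=config_infer_secondary.txt
```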
Could you please provide the steps for implementing this?