Please provide complete information as applicable to your setup.
• Hardware Platform (GPU)
• DeepStream Version 5.0
• TensorRT Version 7.2.1-1
• NVIDIA GPU Driver Version (valid for GPU only) 450.102.04
• Issue Type (questions)
Pipeline gets stuck when the pgie detects 6 object bboxes and all of them are sent to a secondary-inference custom NN
I detect bboxes for objects in primary inference, and then added another nvinfer element for secondary inference. The secondary nvinfer runs a custom NN model and operates in secondary mode. To parse the custom NN output, we use a probe.
The pipeline works fine with a simple video containing only one or two objects. But once 6 bboxes are detected, the pipeline gets stuck: on the first frame with 6 bboxes, the app draws them as expected and then freezes, never advancing to the next frame. I am not sure why this happens. I tried a different video, and it also freezes as soon as 6 bboxes are detected.
I traced the execution and found that when the pipeline freezes, it is inside the probe callback, about to execute return GST_PAD_PROBE_OK;. So I believe the app gets stuck at, or immediately after, that point.
Update: in the primary inference config file I set topk=6 and saw no freeze. This caps detection at a maximum of 6 objects, but the pipeline then runs without freezing. However, with topk=7 and 7 or more objects in the frame, the pipeline freezes after drawing the 6 bboxes and the custom NN output on those 6 bboxes. I would prefer not to limit detection to only 6 objects if there is another way around this.
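For reference, the topk workaround described above corresponds to a setting like the following in the primary nvinfer config file (a minimal sketch; topk is a standard nvinfer clustering property, and placing it under [class-attrs-all] so it applies to every class is my assumption about this setup):

```
# Primary nvinfer config file (pgie_config.txt) - excerpt
[class-attrs-all]
# Keep only the 6 highest-confidence detections per frame.
# With this cap in place the pipeline does not freeze;
# raising it to 7 reproduces the hang when >=7 objects appear.
topk=6
```

If a per-class limit is wanted instead, the same property can go under a [class-attrs-&lt;class-id&gt;] group for a specific class.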
How can I solve this issue?