How to add the BatchedNMSDynamic_TRT plugin to an ONNX model?

Description

I want to add the BatchedNMSDynamic_TRT plugin at the end of my SSD ONNX model, and I want the resulting model to still be in ONNX format (or any other format, but not a TensorRT engine). My current model has two outputs, boxes and scores: one holds the bounding-box coordinates and the other the corresponding confidence scores. Please help me with this. Python code is preferred.

Environment

• Hardware Platform (GPU)
• DeepStream Version 6.2
• TensorRT Version 8.5.2
• NVIDIA GPU Driver Version 525.85.12
• Issue Type (questions)

Hi,
Please refer to the links below for custom plugin implementation and samples:

While the IPluginV2 and IPluginV2Ext interfaces are still supported for backward compatibility with TensorRT 5.1 and 6.0.x respectively, we recommend that you write new plugins or refactor existing ones to target the IPluginV2DynamicExt or IPluginV2IOExt interfaces instead.
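As a starting point, below is a minimal sketch (not an official sample) of one common approach: appending a BatchedNMSDynamic_TRT node to the graph with onnx-graphsurgeon. The file name "ssd.onnx", the output tensor names "boxes" and "scores", and the shapes and NMS attribute values are assumptions you will need to adapt to your own SSD model.

```python
# Minimal sketch, assuming the model file, output names, shapes, and
# attribute values below -- adjust all of these to your network.
import numpy as np
import onnx
import onnx_graphsurgeon as gs

graph = gs.import_onnx(onnx.load("ssd.onnx"))

# Look up the existing outputs by name (assumed names).
tensors = {t.name: t for t in graph.outputs}
boxes = tensors["boxes"]    # expected shape: [batch, num_boxes, 1, 4]
scores = tensors["scores"]  # expected shape: [batch, num_boxes, num_classes]

batch_size = 1    # assumption
keep_top_k = 100  # assumption
num_classes = 91  # assumption (e.g. COCO SSD)

# Output tensors of BatchedNMSDynamic_TRT.
num_detections = gs.Variable("num_detections", dtype=np.int32, shape=[batch_size, 1])
nmsed_boxes = gs.Variable("nmsed_boxes", dtype=np.float32, shape=[batch_size, keep_top_k, 4])
nmsed_scores = gs.Variable("nmsed_scores", dtype=np.float32, shape=[batch_size, keep_top_k])
nmsed_classes = gs.Variable("nmsed_classes", dtype=np.float32, shape=[batch_size, keep_top_k])

nms = gs.Node(
    op="BatchedNMSDynamic_TRT",
    name="batched_nms",
    attrs={
        "shareLocation": True,
        "backgroundLabelId": -1,
        "numClasses": num_classes,
        "topK": 1000,
        "keepTopK": keep_top_k,
        "scoreThreshold": 0.3,
        "iouThreshold": 0.5,
        "isNormalized": True,
        "clipBoxes": True,
    },
    inputs=[boxes, scores],
    outputs=[num_detections, nmsed_boxes, nmsed_scores, nmsed_classes],
)

graph.nodes.append(nms)
graph.outputs = [num_detections, nmsed_boxes, nmsed_scores, nmsed_classes]
graph.cleanup().toposort()

onnx.save(gs.export_onnx(graph), "ssd_with_nms.onnx")
```

Note that the resulting ONNX file is only consumable by TensorRT's ONNX parser, which maps the otherwise unknown BatchedNMSDynamic_TRT op to the plugin registered under that name; ONNX Runtime and the ONNX checker will report it as an unsupported op.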

Thanks!
