getPluginCreator could not find plugin: ProposalDynamic version: 1

Hello,

I exported a model from HDF5 to ONNX and then to a TensorRT engine. It's a FasterRCNN model.

When I try inference with TensorRT (following this notebook), I get an error:

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
runtime = trt.Runtime(TRT_LOGGER)
with open(engine_file, "rb") as f:
    engine = runtime.deserialize_cuda_engine(f.read())

==>

[03/11/2024-09:03:12] [TRT] [E] 3: getPluginCreator could not find plugin: ProposalDynamic version: 1
[03/11/2024-09:03:12] [TRT] [E] 1: [pluginV2Runner.cpp::load::303] Error Code 1: Serialization (Serialization assertion creator failed.Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)

TensorRT version: 8.6.1
TAO version: 5.0.0

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks

You need to load libnvinfer_plugin before deserializing the engine, so that the TAO plugin creators (including ProposalDynamic) are registered. You can refer to
tao_deploy/nvidia_tao_deploy/inferencer/trt_inferencer.py at dbc3da1f7d9688d6bf4b4fdba446d8dcfc4f3bc2 · NVIDIA/tao_deploy · GitHub.
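As a minimal sketch of that approach: load the plugin shared library into the process with `RTLD_GLOBAL`, then call `trt.init_libnvinfer_plugins` so the creators are registered before deserialization. The function name `load_trt_engine` and the default library path `libnvinfer_plugin.so` are assumptions for illustration; for TAO models it must be the TAO-built libnvinfer_plugin, not necessarily the stock one.

```python
import ctypes

def load_trt_engine(engine_file, plugin_lib="libnvinfer_plugin.so"):
    """Deserialize a TensorRT engine after loading plugin creators.

    plugin_lib is assumed to point at the TAO-built libnvinfer_plugin;
    adjust the path for your installation.
    """
    # RTLD_GLOBAL makes the library's IPluginCreator symbols visible
    # to TensorRT's plugin registry.
    ctypes.CDLL(plugin_lib, mode=ctypes.RTLD_GLOBAL)

    import tensorrt as trt  # imported after the plugin library is loaded

    logger = trt.Logger(trt.Logger.WARNING)
    # Register all plugin creators (ProposalDynamic included) in the registry.
    trt.init_libnvinfer_plugins(logger, "")

    runtime = trt.Runtime(logger)
    with open(engine_file, "rb") as f:
        return runtime.deserialize_cuda_engine(f.read())
```

With this in place, `deserialize_cuda_engine` should find the `ProposalDynamic` creator instead of raising the serialization assertion above.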

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.