Python: TensorRT engine deserialization failed

I followed the tutorial and exported “model_gn.engine” successfully.
I want to load the engine file with this code:

import tensorrt as trt

engine_path="model_gn.engine"

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
with open(engine_path, "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())

Then I got this error:

[04/23/2023-15:53:59] [TRT] [E] 1: [pluginV2Runner.cpp::load::292] Error Code 1: Serialization (Serialization assertion creator failed.Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
[04/23/2023-15:53:59] [TRT] [E] 4: [runtime.cpp::deserializeCudaEngine::49] Error Code 4: Internal Error (Engine deserialization failed.)

Is there a problem with my setup?

Here is my environment info:

pip list | grep tensorrt
tensorrt                      8.4.1.5

Hi,

The model uses a plugin library.
Could you initialize the plugins before deserializing the engine?

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
trt.init_libnvinfer_plugins(TRT_LOGGER, namespace="")

Below is a Python sample for your reference:
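The sample attachment from the original reply did not survive in this thread, so here is a minimal, hedged sketch of the same idea (assuming the TensorRT 8.x Python API; `model_gn.engine` is the file from the question). The import is deferred into the function only so the helper can be defined on a machine without TensorRT installed:

```python
def load_engine(engine_path):
    """Deserialize a TensorRT engine, registering built-in plugins first."""
    import tensorrt as trt  # deferred: requires a machine with TensorRT installed

    logger = trt.Logger(trt.Logger.WARNING)
    # Register all built-in plugin creators before deserialization; without
    # this, engines that use plugins fail with
    # "Cannot deserialize plugin since corresponding IPluginCreator not found".
    trt.init_libnvinfer_plugins(logger, namespace="")

    with open(engine_path, "rb") as f, trt.Runtime(logger) as runtime:
        engine = runtime.deserialize_cuda_engine(f.read())
    if engine is None:
        raise RuntimeError("Engine deserialization failed")
    return engine
```

Usage: engine = load_engine("model_gn.engine")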

Thanks.


That works! Thanks a lot!

Can you give an example in C++?

Hi,

Please find the sample below:

Thanks.

Okay! Thanks!

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.