Yolov4 tensorrt loading IPluginCreator not found error

Description

I made a YOLOv4 TensorRT engine file using the CV notebook samples of the TAO Toolkit in a Python virtual environment, and the model was validated using tao inference. It worked fine.

But when I tried to load it in TensorRT, I got the following error:

<_io.BufferedReader name='./trt.engine'>
[03/31/2023-05:23:57] [TRT] [E] 1: [pluginV2Runner.cpp::load::300] Error Code 1: Serialization (Serialization assertion creator failed.Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
[03/31/2023-05:23:57] [TRT] [E] 4: [runtime.cpp::deserializeCudaEngine::66] Error Code 4: Internal Error (Engine deserialization failed.)

What is the problem?

Environment

TensorRT Version: 8.5.1
GPU Type: 3090 TI
Nvidia Driver Version: 530.30.02
CUDA Version: 12.1
CUDNN Version: 8.8.1
Operating System + Version: Ubuntu 22.04
Python Version (if applicable): 3.10
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):

Relevant Files

Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (Github repo, Google Drive, Dropbox, etc.)

Steps To Reproduce

Please include:

  • Exact steps/commands to build your repro
  • Exact steps/commands to run your repro
  • Full traceback of errors encountered

Hi,
Please refer to the links below for custom plugin implementation and samples:

While the IPluginV2 and IPluginV2Ext interfaces are still supported for backward compatibility with TensorRT 5.1 and 6.0.x respectively, we recommend that you write new plugins, or refactor existing ones, to target the IPluginV2DynamicExt or IPluginV2IOExt interfaces instead.

Thanks!


Thanks for the quick response. I will take a look at the sample you gave me.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.