Error: Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry with Yolov4 TensorRT engine

Description

I trained a YOLOv4 model with the TAO Toolkit and exported it to a .etlt model.
I wanted to run inference with the TensorRT Python API.

  1. I’m using the TensorRT container, version 22.01.
  2. Installed TensorRT OSS using the script /opt/tensorrt/install_opensource.sh within the container.
  3. Downloaded tao-converter (version 3.22.05_trt8.2_x86) and converted the YOLOv4 .etlt model to an engine with the command below:
./tao-converter ../models/ppe_v21_yolov4_mobilenetv2_epoch_070.etlt -k nvidia_tlt -d 1,3,160,320 -t fp16 -e ../engines/ppe_yolov4_mobilenetv2_epoch70_fp16.engine -p Input,1x3x320x160,8x3x320x160,16x3x320x160 -o BatchedNMS
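
As an aside, the generated engine can also be sanity-checked with trtexec from the same container (trtexec registers the built-in plugins on startup, so it exercises only the engine itself):

trtexec --loadEngine=../engines/ppe_yolov4_mobilenetv2_epoch70_fp16.engine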

I’m able to generate the engine with the above command, but I’m unable to deserialize it when I try to run inference with the Python API. I’ve attached the code that runs inference on the generated engine.

trt_loader.py (9.3 KB)
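
The failing step boils down to deserializing the engine, roughly like this (a minimal sketch of what the attached script does; the engine path is taken from the conversion command above):

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.INFO)

# Read the serialized engine and hand it to the TensorRT runtime; this is
# the call that fails with the plugin-registry error below.
with open("../engines/ppe_yolov4_mobilenetv2_epoch70_fp16.engine", "rb") as f:
    engine_bytes = f.read()

with trt.Runtime(TRT_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(engine_bytes)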

Environment

TensorRT Version: 8.2.2
GPU Type: RTX 3070
Nvidia Driver Version: 530.30.02
CUDA Version: 11.6
CUDNN Version: 8.3.2
Operating System + Version: Ubuntu 20.04
Python Version (if applicable): 3.8
TensorFlow Version (if applicable): NA
PyTorch Version (if applicable): NA
Baremetal or Container (if container which image + tag): Container version 22.01

Steps To Reproduce

Full traceback of errors encountered:
[04/13/2023-13:01:08] [TRT] [I] [MemUsageChange] Init CUDA: CPU +750, GPU +0, now: CPU 785, GPU 842 (MiB)
[04/13/2023-13:01:08] [TRT] [I] Loaded engine size: 4 MiB
[04/13/2023-13:01:08] [TRT] [E] 1: [pluginV2Runner.cpp::load::290] Error Code 1: Serialization (Serialization assertion creator failed.Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
[04/13/2023-13:01:08] [TRT] [E] 4: [runtime.cpp::deserializeCudaEngine::50] Error Code 4: Internal Error (Engine deserialization failed.)
Traceback (most recent call last):

Kindly help me fix this issue.

Hi,
Please refer to the documentation and samples on custom plugin implementation:

While the IPluginV2 and IPluginV2Ext interfaces are still supported for backward compatibility with TensorRT 5.1 and 6.0.x respectively, we recommend that you write new plugins or refactor existing ones to target the IPluginV2DynamicExt or IPluginV2IOExt interfaces instead.
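
As a quick diagnostic, you can print the creators currently registered in the Plugin Registry from Python (a minimal sketch); the engine can only be deserialized if a BatchedNMS creator appears in this list:

import tensorrt as trt

# List every plugin creator currently registered with TensorRT.
for creator in trt.get_plugin_registry().plugin_creator_list:
    print(creator.name, creator.plugin_version)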

Thanks!

Hi,
Thanks for your reply. I think a YOLOv4 model trained with the TAO Toolkit only requires the installation of TensorRT OSS. Please refer to this link:
https://docs.nvidia.com/metropolis/TLT/tlt-user-guide/text/object_detection/yolo_v4.html#tensorrt-open-source-software-oss
There isn’t any mention of writing a custom plugin for this model.
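
For reference, the only plugin-related step in my script is preloading the OSS-built library, along the lines of this sketch (the library path is an assumption; it depends on where install_opensource.sh installed it):

import ctypes

# Preload the OSS-built TensorRT plugin library; the name/path here is an
# assumption and may need to be adjusted.
ctypes.CDLL("libnvinfer_plugin.so", mode=ctypes.RTLD_GLOBAL)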

Hi Team, any leads on this, please?

Hi @arivarasan.e,
Are you registering the plugins before deserializing the TRT engine in Python?
Maybe you should add the following code before deserializing the engine (TRT_LOGGER is your trt.Logger instance; the second argument is the plugin namespace):
trt.init_libnvinfer_plugins(TRT_LOGGER, "")
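
Put together with the deserialization step, a working loader would look roughly like this (a sketch; the engine path is the one from the conversion step above):

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.INFO)

# Register the built-in plugin creators (including BatchedNMS) with the
# Plugin Registry before any engine is deserialized.
trt.init_libnvinfer_plugins(TRT_LOGGER, "")

with open("../engines/ppe_yolov4_mobilenetv2_epoch70_fp16.engine", "rb") as f:
    engine_bytes = f.read()

with trt.Runtime(TRT_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(engine_bytes)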

Thanks

This resolved the issue. Thanks for the help :)
