I am trying to deserialize the .rt engine file created by tkDNN.
When I tried deserializing it using the sample Python API, I ran into the following issue:
[TensorRT] ERROR: deserializationUtils.cpp (635) - Serialization Error in load: 0 (Serialized engine contains plugin, but no plugin factory was provided. To deserialize an engine without a factory, please use IPluginV2 instead.)
[TensorRT] ERROR: INVALID_STATE: std::exception
[TensorRT] ERROR: INVALID_CONFIG: Deserialize the cuda engine failed.
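The error indicates the engine was serialized with custom layers built on the legacy IPlugin/IPluginFactory interface, which the Python API cannot deserialize (it has no way to supply a plugin factory). For comparison, here is a minimal sketch of how deserialization works when the custom layers use IPluginV2 and register themselves with the plugin registry; `plugin_lib_path` is a hypothetical shared library that performs that registration on load, not something tkDNN ships:

```python
import ctypes

def load_engine(engine_path, plugin_lib_path=None):
    """Deserialize a TensorRT engine whose custom layers use IPluginV2.

    NOTE: this is a sketch. It does NOT work for engines serialized with
    the legacy IPlugin/IPluginFactory interface (the error shown above);
    those require the factory on the C++ side. plugin_lib_path is a
    hypothetical .so that registers its IPluginV2 creators when loaded.
    """
    import tensorrt as trt  # imported lazily; requires a TensorRT install

    if plugin_lib_path:
        # Loading the shared library runs its plugin-creator registration.
        ctypes.CDLL(plugin_lib_path)

    logger = trt.Logger(trt.Logger.WARNING)
    # Register TensorRT's built-in plugins with the global plugin registry.
    trt.init_libnvinfer_plugins(logger, "")

    with open(engine_path, "rb") as f, trt.Runtime(logger) as runtime:
        return runtime.deserialize_cuda_engine(f.read())
```

If the tkDNN plugins were ported to IPluginV2 and exported from a shared library, a call like `load_engine("yolo4csp_fp16.rt", "./libtkdnn_plugins.so")` (hypothetical file names) would be the expected usage.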
Environment
TensorRT Version: 7.2.1, 7.0.0, 7.1.3, 6.0.1
GPU Type: Tesla T4
Nvidia Driver Version: 455.32
CUDA Version: 11.1, 10.2, 11.1, 10.2
CUDNN Version: 8.0.4, 7.6.5, 8.0.4, 7.6.5
Operating System + Version: Ubuntu 18.04
Python Version (if applicable): 3.6
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag): NGC containers for each corresponding TensorRT version
Steps To Reproduce
I followed the same steps as in the tkDNN README for yolov4-csp. While building the engine, I faced the same error as in the log above.
Please reply if you need any more input from my side. Thanks!
Sorry for the late response. We tried reproducing the issue, but we are facing some setup-related problems, and it is not clear from the tkDNN repo which section you are following.
Could you please share the model file and the script you are using?