I am trying to deserialize the TensorRT engine file created by tkDNN.
When I try to deserialize it using the sample Python API, I get the following error:
[TensorRT] ERROR: deserializationUtils.cpp (635) - Serialization Error in load: 0 (Serialized engine contains plugin, but no plugin factory was provided. To deserialize an engine without a factory, please use IPluginV2 instead.)
[TensorRT] ERROR: INVALID_STATE: std::exception
[TensorRT] ERROR: INVALID_CONFIG: Deserialize the cuda engine failed.
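For reference, this is the standard pattern I understand for deserializing an engine that contains plugins in the Python API: load the plugin shared library into the process and register the built-in plugin creators before calling `deserialize_cuda_engine`. This only works when the plugins implement `IPluginV2` (as the error message suggests); the paths below are hypothetical placeholders.

```python
def load_engine(engine_path, plugin_lib):
    """Deserialize a TensorRT engine after loading a custom plugin library.

    engine_path -- path to the serialized engine (e.g. the tkDNN .rt file)
    plugin_lib  -- shared library with IPluginV2 creators (hypothetical path)
    """
    import ctypes
    import tensorrt as trt

    # Load the custom plugin creators into the process so the runtime
    # can find them during deserialization.
    ctypes.CDLL(plugin_lib)

    logger = trt.Logger(trt.Logger.INFO)
    # Register TensorRT's built-in plugins as well.
    trt.init_libnvinfer_plugins(logger, "")

    with open(engine_path, "rb") as f, trt.Runtime(logger) as runtime:
        return runtime.deserialize_cuda_engine(f.read())


if __name__ == "__main__":
    # Hypothetical paths; adjust to your tkDNN build output.
    engine = load_engine("yolo4_csp_fp32.rt", "/path/to/libtkdnn_plugins.so")
```

This pattern still fails with the error above if the engine was serialized through the legacy `IPlugin`/plugin-factory path, since the Python API provides no way to pass a plugin factory; in that case the plugins need to be migrated to `IPluginV2` before serialization.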
TensorRT Version: 7.2.1, 7.0.0, 7.1.3, 6.0.1 (tried each)
GPU Type: Tesla T4
Nvidia Driver Version: 455.32
CUDA Version: 11.1, 10.2, 11.1, 10.2 (matching the TensorRT versions above)
CUDNN Version: 8.0.4, 7.6.5, 8.0.4, 7.6.5 (matching the TensorRT versions above)
Operating System + Version: Ubuntu 18.04
Python Version (if applicable): 3.6
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag): NGC containers for each corresponding TensorRT version
Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (Github repo, Google Drive, Dropbox, etc.)
Steps To Reproduce
- Exact steps/commands to build your repro
Same steps as in the tkDNN README for yolov4-csp.
- Exact steps/commands to run your repro
- Full traceback of errors encountered