Description
I made a YOLOv4 TensorRT engine file using the CV notebook samples of the TAO Toolkit in a Python virtual environment, and the model was validated with tao inference. It worked fine.
But when I tried to load the engine in TensorRT, I got an error like this:
<_io.BufferedReader name='./trt.engine'>
[03/31/2023-05:23:57] [TRT] [E] 1: [pluginV2Runner.cpp::load::300] Error Code 1: Serialization (Serialization assertion creator failed.Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
[03/31/2023-05:23:57] [TRT] [E] 4: [runtime.cpp::deserializeCudaEngine::66] Error Code 4: Internal Error (Engine deserialization failed.)
What is the problem?
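For context, this is roughly how I deserialize the engine. The plugin-library path (`libnvinfer_plugin.so`) and the engine filename are placeholders for my setup; the `ctypes.CDLL` call and `trt.init_libnvinfer_plugins` are my attempt to make the plugin creators visible to the registry before deserialization, which I understand is required for engines that use plugins:

```python
import ctypes

def load_engine(engine_path, plugin_lib="libnvinfer_plugin.so"):
    """Deserialize a TensorRT engine after registering plugin creators.

    plugin_lib is a placeholder: for TAO models this may need to be the
    TRT OSS / TAO-built plugin library rather than the stock one.
    """
    import tensorrt as trt

    # Load the plugin shared library so its IPluginCreators self-register.
    ctypes.CDLL(plugin_lib)

    logger = trt.Logger(trt.Logger.WARNING)
    # Register all built-in TensorRT plugins in the default namespace.
    trt.init_libnvinfer_plugins(logger, "")

    with open(engine_path, "rb") as f:
        runtime = trt.Runtime(logger)
        return runtime.deserialize_cuda_engine(f.read())
```

The error occurs at the `deserialize_cuda_engine` call.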
Environment
TensorRT Version: 8.5.1
GPU Type: 3090 TI
Nvidia Driver Version: 530.30.02
CUDA Version: 12.1
CUDNN Version: 8.8.1
Operating System + Version: Ubuntu 22.04
Python Version (if applicable): 3.10
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):
Relevant Files
Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (Github repo, Google Drive, Dropbox, etc.)
Steps To Reproduce
Please include:
- Exact steps/commands to build your repro
- Exact steps/commands to run your repro
- Full traceback of errors encountered