
• Hardware Platform: A10 PCIe
• NVIDIA GPU Driver Version: 510.85.02
• CUDA: 11.6
• TensorRT: 8.2.3

An error is reported when deploying the model in Triton Inference Server. The error output is as follows:

E0310 07:37:43.295347 1 logging.cc:43] 1: [stdArchiveReader.cpp::StdArchiveReader::54] Error Code 1: Serialization (Serialization assertion sizeRead == static_cast<uint64_t>(mEnd - mCurrent) failed.Size specified in header does not match archive size)
E0310 07:37:43.295377 1 logging.cc:43] 4: [runtime.cpp::deserializeCudaEngine::50] Error Code 4: Internal Error (Engine deserialization failed.)
I0310 07:37:43.307673 1 tensorrt.cc:5343] TRITONBACKEND_ModelInstanceFinalize: delete instance state
I0310 07:37:43.307731 1 tensorrt.cc:5282] TRITONBACKEND_ModelFinalize: delete model state
E0310 07:37:43.308175 1 model_repository_manager.cc:1152] failed to load 'firesmoke5' version 1: Internal: unable to create TensorRT engine
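This serialization assertion usually means the plan file does not match the TensorRT version that Triton's TensorRT backend is using, or that the file was truncated or corrupted while being copied into the model repository. A minimal sketch (Python, assuming the engine sits at the default Triton layout path firesmoke5/1/model.plan, which is an assumption) to verify that the plan deserializes with the locally installed TensorRT:

```python
import tensorrt as trt

# The installed TensorRT version must match the version that produced
# the serialized engine (8.2.3 in this thread).
print("TensorRT version:", trt.__version__)

TRT_LOGGER = trt.Logger(trt.Logger.INFO)

# Path follows Triton's default model-repository layout; adjust as needed.
PLAN_PATH = "firesmoke5/1/model.plan"

with open(PLAN_PATH, "rb") as f:
    plan = f.read()

runtime = trt.Runtime(TRT_LOGGER)
engine = runtime.deserialize_cuda_engine(plan)

if engine is None:
    print("Deserialization failed: the plan is corrupted or was built "
          "with a different TensorRT version.")
else:
    print("Engine deserialized successfully.")
```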

Hi,
Please refer to the links below for custom plugin implementation details and a sample:

While the IPluginV2 and IPluginV2Ext interfaces are still supported for backward compatibility with TensorRT 5.1 and 6.0.x respectively, we recommend that you write new plugins or refactor existing ones to target the IPluginV2DynamicExt or IPluginV2IOExt interfaces instead.

Thanks!

Thank you.
This issue has been resolved: the environment settings were incorrect during the TensorRT installation, and the error went away after I reset the environment.
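For reference, after fixing the environment the engine generally needs to be rebuilt with the same TensorRT version that Triton will use to deserialize it. A minimal sketch of rebuilding a plan from ONNX with the TensorRT 8.2-era Python API; "firesmoke5.onnx" is a placeholder name, not taken from this thread:

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.INFO)

builder = trt.Builder(TRT_LOGGER)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
parser = trt.OnnxParser(network, TRT_LOGGER)

# "firesmoke5.onnx" is a placeholder for the exported model file.
with open("firesmoke5.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("Failed to parse the ONNX model")

config = builder.create_builder_config()
config.max_workspace_size = 1 << 30  # 1 GiB; TensorRT 8.2-era setting

# build_serialized_network returns an already-serialized engine (TensorRT 8.x).
serialized_engine = builder.build_serialized_network(network, config)
if serialized_engine is None:
    raise RuntimeError("Engine build failed")

# Write the plan into the Triton model repository (path is an assumption).
with open("firesmoke5/1/model.plan", "wb") as f:
    f.write(serialized_engine)
```

Building and deploying on the same machine (or at least with identical TensorRT versions) avoids the "Size specified in header does not match archive size" deserialization error.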