I’m trying to deploy a RetinaNet model trained on my custom dataset to NVIDIA Triton Inference Server (tritonserver:21.08-py3). The model was trained with TAO v3.21.11-tf1.15.4-py3 on an NVIDIA GeForce RTX 3060, but the server keeps failing with the error below:
E0520 15:04:50.516897 1 logging.cc:43] 3: getPluginCreator could not find plugin: BatchTilePlugin_TRT version: 1
E0520 15:04:50.519766 1 logging.cc:43] 1: [pluginV2Runner.cpp::load::291] Error Code 1: Serialization (Serialization assertion creator failed.Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
Solved by converting the .etlt file from “tao export” with a later version of TensorRT and deploying with a later Triton container, nvcr.io/nvidia/tritonserver:21.11-py3. The error occurs because a serialized TensorRT engine is tied to the TensorRT version (and plugin registry) it was built with: the 21.08 Triton container’s TensorRT could not find the BatchTilePlugin_TRT creator needed to deserialize the engine, while the 21.11 container ships a TensorRT build that registers it.
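A minimal sketch of the two steps, assuming the standalone tao-converter binary matching the 21.11 container’s TensorRT version; the key, input dimensions (-d), output node name (-o NMS is typical for TAO detection models exported with the NMS plugin), and all paths are placeholders you must adapt to your own model:

```shell
# Build the TensorRT engine from the exported .etlt with a tao-converter
# that matches the TensorRT version inside tritonserver:21.11-py3.
# -k : encryption key used during "tao export" (placeholder)
# -d : input dimensions C,H,W of your trained model (placeholder)
# -o : output node name(s); NMS is the usual RetinaNet/TAO detection output
# -e : where to write the serialized engine (placeholder path)
./tao-converter \
  -k $ENCRYPTION_KEY \
  -d 3,544,960 \
  -o NMS \
  -e model_repository/retinanet/1/model.plan \
  retinanet.etlt

# Serve the engine with the matching Triton container so the plugin
# registry at deserialization time matches the one used at build time.
docker run --gpus all --rm \
  -p 8000:8000 -p 8001:8001 -p 8002:8002 \
  -v $(pwd)/model_repository:/models \
  nvcr.io/nvidia/tritonserver:21.11-py3 \
  tritonserver --model-repository=/models
```

The key point is version alignment: the converter that builds model.plan and the Triton container that loads it should bundle the same TensorRT release, otherwise deserialization fails with missing-plugin errors like the one above.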