Deserialise a DeepStream-built YoloV3 TensorRT engine

Hi,

I’ve run deepstream-app to generate a .engine file, and I’m now trying to write a script that uses it for inference. I was following this documentation: https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#perform_inference_python

When I try to deserialise the .engine file for inference, I get the following error:

"
[TensorRT] ERROR: getPluginCreator could not find plugin YOloLayerV3_TRT version 1 namespace
[TensorRT] ERROR: Cannot deserialize plugin YoloLayerV3_TRT
"

How can I add this plugin? The documentation gets a list of registered plugins by running trt.get_plugin_registry().plugin_creator_list, but there doesn’t appear to be a YoloV3 plugin in that list.

Any help will be appreciated.

It’s here: sources/objectDetector_Yolo/nvdsinfer_custom_impl_Yolo

Thanks ChrisDing - so if I want to deserialise the .engine file, do I first need to add sources/objectDetector_Yolo/nvdsinfer_custom_impl_Yolo to the plugin registry?

It’s better to use deepstream-app to deploy. Follow sources/objectDetector_Yolo/README.
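For anyone following that README, building the custom library looks roughly like this (a sketch; the CUDA_VER value and config filename depend on your DeepStream/CUDA versions):

```shell
# Build the custom plugin/parser library used by deepstream-app.
# CUDA_VER must match your installed CUDA toolkit (10.2 is an assumption).
cd sources/objectDetector_Yolo/nvdsinfer_custom_impl_Yolo
CUDA_VER=10.2 make

# Then run the sample from sources/objectDetector_Yolo, e.g.:
# deepstream-app -c deepstream_app_config_yoloV3.txt
```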