I have already successfully built an inference engine from a prototxt that includes plugin layers. Yes, I wrote the plugin layer class and the plugin factory class. Now I would like to use the DeepStream SDK to build a task. I found that DeepStream uses the TensorRT .so as the library to parse the prototxt; however, it seems that addInferenceTask cannot parse the plugin layers. What can I do to enable DeepStream to support plugin layers?
We had the same problem with two networks. For the first one, we removed all custom layers from the .prototxt; you need to build an equivalent replacement out of TensorRT-supported layers. If you can't do that, then do what we did for the second network: create a DeepStream plugin that calls TensorRT directly, replacing DeepStream's internal inference module. DeepStream hides the plugin factory that is present in TensorRT's Caffe parser.
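To illustrate the workaround above, here is a minimal sketch of building the engine yourself with TensorRT's Caffe parser and your existing plugin factory, which is the hook DeepStream's built-in inference module does not expose. It assumes a TensorRT 3/4-era API; `MyPluginFactory` stands in for the factory class you already wrote (derived from `nvcaffeparser1::IPluginFactory`), and the output blob name `"prob"` is a placeholder for your network's actual output.

```cpp
#include "NvInfer.h"
#include "NvCaffeParser.h"

using namespace nvinfer1;
using namespace nvcaffeparser1;

// MyPluginFactory: your existing nvcaffeparser1::IPluginFactory implementation.
IHostMemory* buildEngineWithPlugins(const char* deployFile,
                                    const char* modelFile,
                                    MyPluginFactory& pluginFactory,
                                    ILogger& logger)
{
    IBuilder* builder = createInferBuilder(logger);
    INetworkDefinition* network = builder->createNetwork();
    ICaffeParser* parser = createCaffeParser();

    // Register your plugin factory before parsing -- this is the call
    // DeepStream's internal inference module never makes on your behalf.
    parser->setPluginFactory(&pluginFactory);

    const IBlobNameToTensor* blobs =
        parser->parse(deployFile, modelFile, *network, DataType::kFLOAT);
    network->markOutput(*blobs->find("prob"));  // placeholder output name

    builder->setMaxBatchSize(1);
    builder->setMaxWorkspaceSize(1 << 26);
    ICudaEngine* engine = builder->buildCudaEngine(*network);

    // Serialize so your DeepStream custom module can deserialize at runtime.
    IHostMemory* serialized = engine->serialize();
    engine->destroy();
    network->destroy();
    parser->destroy();
    builder->destroy();
    return serialized;
}
```

Your custom DeepStream module would then deserialize this engine and run inference with an execution context, bypassing DeepStream's built-in Caffe parsing entirely.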
As haifengli said, the plugin API is not supported in DeepStream's TensorRT-based inference module.
To enable plugin support, please define a custom module via the DeepStream plug-in mechanism.
See our documentation for details:
>> 3.2 PLUG-IN MECHANISM