I am using the RefineDet network. I want to accelerate inference, but RefineDet modifies the DetectionOutput layer, so it isn't supported by TensorRT 5.0.
How can I implement a plugin to support the modified DetectionOutput layer?
Please reference Extending TensorRT With Custom Layers
I tried to define a custom plugin to support the RefineDet DetectionOutput layer, following samplePlugin.
But when I set the custom plugin factory on the parser, the plugin is never invoked: parsing proceeds without using it, so the plugin factory appears to have no effect.
// Create the builder
IBuilder* builder = createInferBuilder(gLogger);
// Parse the caffe model to populate the network, then set the outputs
INetworkDefinition* network = builder->createNetwork();
ICaffeParser* parser = createCaffeParser();
parser->setPluginFactoryExt(pluginFactory);
…
const IBlobNameToTensor* blobNameToTensor = parser->parse(locateFile(deployFile).c_str(),
                                                          locateFile(modelFile).c_str(),
                                                          *network,
                                                          dataType);
Please reference tensorrt/samples/samplePlugin
https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#plugin_sample
I removed the objectness_score param that was causing the errors, and now it works!
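For anyone hitting the same error: RefineDet's Caffe fork extends detection_output_param with an objectness_score field, which TensorRT 5's stock Caffe parser does not recognize. Deleting that one line from deploy.prototxt lets the parser accept the layer. A sketch of the relevant fragment; the layer name and parameter values are illustrative, check your own prototxt:

```protobuf
layer {
  name: "detection_out"
  type: "DetectionOutput"
  # ... bottoms and other fields unchanged ...
  detection_output_param {
    num_classes: 21        # illustrative value
    share_location: true
    # Remove the line below: this RefineDet-specific field is what
    # TensorRT 5's stock Caffe parser rejects.
    objectness_score: 0.01
  }
}
```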