Hi, I am new to TensorRT and not very familiar with C++ either. Is there any example of importing a Caffe model (via the caffe parser) while also using a plugin, all from Python? Plugin library example: “https://docs.nvidia.com/deeplearning/sdk/tensorrt-api/c_api/_nv_infer_plugin_8h_source.html”.
I saw an example doing something like the code below. Is it necessary to modify the PluginFactory class, or is that already built into the Python plugin API?
```python
import tensorrt as trt  # aliased as trt, since trt.utils is used below
import tensorrtplugins
from tensorrt.plugins import _nv_infer_plugin_bindings as nvinferplugin
from tensorrt.parsers import caffeparser

plugin_factory = tensorrtplugins.FullyConnectedPluginFactory()
parser = caffeparser.create_caffe_parser()
parser.set_plugin_factory(plugin_factory)

engine = trt.utils.caffe_to_trt_engine(G_LOGGER,
                                       MODEL_PROTOTXT,
                                       CAFFE_MODEL,
                                       1,           # max batch size
                                       1 << 20,     # max workspace size
                                       OUTPUT_LAYERS,
                                       trt.infer.DataType.FLOAT,
                                       plugin_factory)
```
P.S.: I am trying to convert YOLOv2 to the TensorRT format, so some layers (e.g. kYOLOREORG and kPRELU) can only be supported through plugins.
Another way to do this would be to add the plugin while constructing the network, via the network.add_plugin_ext() method. However, I am not sure how to specify the previous layer whose output should be fed into the plugin, since that layer is imported later.
Thank you in advance; your help will be much appreciated!