Description
When I build a network with `addLayer` calls, I can add a custom layer through the `IPluginV2` and `IPluginCreator` interfaces. When I import a model with the ONNX parser instead, can I only add a custom layer through the `IPluginV2` and `IPluginFactory` interfaces?
If only `IPluginFactory` can be used, its `createPlugin` function only receives `weights` and `nbWeights`. How can I pass other parameters to my plugin?
Environment
TensorRT Version: TensorRT 5.1.4 (DRIVE Software 10)
GPU Type: DRIVE AGX Xavier
Nvidia Driver Version:
CUDA Version: 10.2
CUDNN Version: 7.6
Operating System + Version: Ubuntu
Python Version (if applicable):
TensorFlow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):
Relevant Files
Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (Github repo, Google Drive, Dropbox, etc.)
Steps To Reproduce
Please include:
- Exact steps/commands to build your repro
- Exact steps/commands to run your repro
- Full traceback of errors encountered