I am trying to migrate my code from the UFF parser to the ONNX parser, since TensorRT will deprecate the UFF parser in a future release.
In my Python UFF conversion pipeline, I use graphsurgeon's collapse_namespaces to replace a namespace with a TensorRT custom plugin node. The modified graph is then converted to UFF with the uff.from_tensorflow function.
In my Python ONNX conversion pipeline, I still use graphsurgeon's collapse_namespaces, but instead of calling uff.from_tensorflow I call tf2onnx's from_graph_def. Unfortunately, this does not work: tf2onnx imports the graph_def by calling tf.import_graph_def, and my custom plugin's op name is obviously not registered with TensorFlow, so the import fails. What should I do to resolve this?
TensorRT Version: 6.0.1
GPU Type: RTX 2080 Ti
Nvidia Driver Version: 450
CUDA Version: 10.0
CUDNN Version: 7.6
Operating System + Version: Ubuntu 18.04
Python Version (if applicable): 3.6
TensorFlow Version (if applicable): 1.15
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):
Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (Github repo, Google Drive, Dropbox, etc.)