How to convert tensorflow graph to ONNX with a custom plugin replacing a namespace?

Description

I am trying to migrate my code from the UFF parser to the ONNX parser, since TensorRT will deprecate the UFF parser in a future release.
In my Python UFF conversion pipeline, I use graphsurgeon.collapse_namespaces to replace a namespace with a TensorRT custom plugin, and then convert the graph to UFF with the uff.from_tensorflow function.
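For reference, the working UFF pipeline looks roughly like this (a minimal sketch; the model path, plugin name, namespace, and output node are placeholders, not my real names):

```python
import graphsurgeon as gs
import uff

# Load the frozen TensorFlow graph and wrap it for editing (placeholder path).
dynamic_graph = gs.DynamicGraph("frozen_model.pb")

# Create a node that stands in for the TensorRT custom plugin (placeholder names).
plugin_node = gs.create_plugin_node(name="MyPlugin", op="MyPluginOp")

# Replace the entire namespace with the single plugin node.
dynamic_graph.collapse_namespaces({"my/namespace": plugin_node})

# Convert the modified graph to UFF (placeholder output node).
uff.from_tensorflow(dynamic_graph.as_graph_def(),
                    output_nodes=["output"],
                    output_filename="model.uff")
```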

In my Python ONNX conversion pipeline, I still use graphsurgeon.collapse_namespaces. However, instead of calling uff.from_tensorflow, I call tf2onnx.from_graph_def. Unfortunately, this does not work: tf2onnx needs to import the graph_def by calling tf.import_graph_def, and my custom plugin op name is obviously not registered in TensorFlow. What should I do to resolve this?
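The failing ONNX attempt, sketched with the same placeholder names (assuming tf2onnx's from_graph_def entry point; tf2onnx calls tf.import_graph_def internally, which raises because "MyPluginOp" is not a registered TensorFlow op):

```python
import graphsurgeon as gs
import tf2onnx

# Same namespace-collapsing step as in the UFF pipeline (placeholder names).
dynamic_graph = gs.DynamicGraph("frozen_model.pb")
plugin_node = gs.create_plugin_node(name="MyPlugin", op="MyPluginOp")
dynamic_graph.collapse_namespaces({"my/namespace": plugin_node})

# This is where it fails: tf2onnx imports the graph_def via
# tf.import_graph_def, and TensorFlow rejects the unregistered "MyPluginOp".
model_proto, _ = tf2onnx.convert.from_graph_def(
    dynamic_graph.as_graph_def(),
    input_names=["input:0"],      # placeholder
    output_names=["MyPlugin:0"])  # placeholder
```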

Environment

TensorRT Version: 6.0.1
GPU Type: RTX 2080 TI
Nvidia Driver Version: 450
CUDA Version: 10.0
CUDNN Version: 7.6
Operating System + Version: Ubuntu 18.04
Python Version (if applicable): 3.6
TensorFlow Version (if applicable): 1.15
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):

Relevant Files

Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (Github repo, Google Drive, Dropbox, etc.)

Steps To Reproduce

Explained above

Hi,
Request you to share the ONNX model and the script if not shared already so that we can assist you better.
Alongside, you can try a few things:

  1. Validate your model with the snippet below.

check_model.py

import onnx

filename = "your_model.onnx"  # path to your ONNX model
model = onnx.load(filename)
onnx.checker.check_model(model)

  2. Try running your model with the trtexec command.
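A minimal trtexec invocation might look like this (the model path is a placeholder; trtexec typically ships with TensorRT, e.g. under /usr/src/tensorrt/bin):

```shell
# Parse the ONNX model and build/run an engine, logging verbosely.
# --explicitBatch is needed for ONNX models on TensorRT 6/7.
trtexec --onnx=model.onnx --explicitBatch --verbose
```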

In case you are still facing the issue, please share the trtexec --verbose log for further debugging.
Thanks!

Hi @adw,

This looks like a TensorFlow-related issue. Could you please post your query here to get better help:
GitHub - tensorflow/tensorflow: An Open Source Machine Learning Framework for Everyone

Thank you.