The pb file dumped by AMP with TF_AUTO_MIXED_PRECISION_GRAPH_REWRITE_LOG_PATH set can't be converted to a TensorBoard summary

I set the following environment variables:
os.environ['TF_CPP_VMODULE'] = 'auto_mixed_precision=2'
os.environ['TF_AUTO_MIXED_PRECISION_GRAPH_REWRITE_LOG_PATH'] = '/home/work/log/fp16verbose'

and got a pb file such as graphdef_AutoMixedPrecision_tf_graph_1566009745594.pb with the AMP fp16 nodes inside.
I want to have a look at the graph in TensorBoard, so I used the import_pb_to_tensorboard tool to convert it:
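
For context, this is roughly how the dump was produced. This is a minimal repro sketch, assuming TF 1.14+ where tf.train.experimental.enable_mixed_precision_graph_rewrite is available; the toy model is a placeholder, not my actual code:

import os

# The variables must be set before TensorFlow runs the graph, so the grappler
# auto_mixed_precision pass can pick them up and dump its GraphDefs.
os.environ['TF_CPP_VMODULE'] = 'auto_mixed_precision=2'
os.environ['TF_AUTO_MIXED_PRECISION_GRAPH_REWRITE_LOG_PATH'] = '/home/work/log/fp16verbose'

import numpy as np
import tensorflow as tf

# Toy model; only the optimizer wrapping matters for the AMP rewrite.
x = tf.placeholder(tf.float32, [None, 1024])
w = tf.Variable(tf.random_normal([1024, 1024]))
loss = tf.reduce_mean(tf.matmul(x, w))

opt = tf.train.AdamOptimizer(1e-3)
# Enable the automatic mixed precision graph rewrite (TF 1.14+).
opt = tf.train.experimental.enable_mixed_precision_graph_rewrite(opt)
train_op = opt.minimize(loss)

with tf.Session() as sess:
  sess.run(tf.global_variables_initializer())
  # The graphdef_AutoMixedPrecision_*.pb files land in the log path above.
  sess.run(train_op, feed_dict={x: np.random.rand(8, 1024).astype(np.float32)})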

python -m tensorflow.python.tools.import_pb_to_tensorboard --model_dir="graphdef_AutoMixedPrecision_tf_graph_1566009745594.pb" --log_dir="."

but the following exception occurs:
ValueError: Node 'model/get_train_op/gradients/model/Transformer/encode/encoder_stack/layer_0/self_attention/layer_normalization/sub_1_grad/Shape_1' expects to be colocated with unknown node 'model/Transformer/encode/encoder_stack/layer_0/self_attention/layer_normalization/sub_1'

How can I solve this and get the graph summary for TensorBoard?

The colocation ('_class') attributes in the dumped GraphDef refer to nodes that the importer cannot resolve. You can strip them with the following snippet:

def _remove_colocation_attrs(graph_def):
  """Strips the '_class' (colocation) attributes from every node."""
  for node in graph_def.node:
    if '_class' in node.attr:
      del node.attr['_class']

For example, in the import_pb_to_tensorboard.py script it can be called right after the GraphDef is parsed:

       graph_def = graph_pb2.GraphDef()
       graph_def.ParseFromString(f.read())

+      # WAR for issues loading optimized graphs.
+      _remove_colocation_attrs(graph_def)

       importer.import_graph_def(graph_def, name='')
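
If you'd rather not patch the script, the same workaround can be applied in a standalone snippet. This is a sketch for TF 1.x using the file name from above; it inlines the same logic as _remove_colocation_attrs and then writes an event file that TensorBoard can read:

import tensorflow as tf
from tensorflow.core.framework import graph_pb2

graph_def = graph_pb2.GraphDef()
with tf.gfile.GFile('graphdef_AutoMixedPrecision_tf_graph_1566009745594.pb', 'rb') as f:
  graph_def.ParseFromString(f.read())

# Drop the '_class' (colocation) attributes that reference unknown nodes.
for node in graph_def.node:
  if '_class' in node.attr:
    del node.attr['_class']

# Import the cleaned GraphDef into a fresh graph.
with tf.Graph().as_default() as graph:
  tf.import_graph_def(graph_def, name='')

# Write the event file; point tensorboard --logdir at this directory.
writer = tf.summary.FileWriter('.', graph)
writer.close()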