The pb file dumped by AMP with TF_AUTO_MIXED_PRECISION_GRAPH_REWRITE_LOG_PATH set can't be converted to a TensorBoard summary

I set the TF_AUTO_MIXED_PRECISION_GRAPH_REWRITE_LOG_PATH environment variable and got a pb file such as graphdef_AutoMixedPrecision_tf_graph_1566009745594.pb, with the AMP fp16 nodes inside.
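For reference, enabling the dump looks like this; the log directory path below is an arbitrary example, not a required value:

```shell
# Hypothetical log directory; when the auto-mixed-precision graph rewrite
# runs, TensorFlow dumps graphdef_*.pb files into this path.
export TF_AUTO_MIXED_PRECISION_GRAPH_REWRITE_LOG_PATH="/tmp/amp_graphs"
mkdir -p "$TF_AUTO_MIXED_PRECISION_GRAPH_REWRITE_LOG_PATH"
echo "$TF_AUTO_MIXED_PRECISION_GRAPH_REWRITE_LOG_PATH"
```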
I want to have a look at the graph in TensorBoard, so I use the import_pb_to_tensorboard tool to convert it:

python -m tensorflow.python.tools.import_pb_to_tensorboard --model_dir="graphdef_AutoMixedPrecision_tf_graph_1566009745594.pb" --log_dir="."

but the following exception occurs:
ValueError: Node 'model/get_train_op/gradients/model/Transformer/encode/encoder_stack/layer_0/self_attention/layer_normalization/sub_1_grad/Shape_1' expects to be colocated with unknown node 'model/Transformer/encode/encoder_stack/layer_0/self_attention/layer_normalization/sub_1'

How can I solve this and get the summary for TensorBoard?

Please use the following code snippet to remove colocation attributes from the graphdef:

def _remove_colocation_attrs(graph_def):
  """Strip the '_class' (colocation) attribute from every node in the GraphDef."""
  for node in graph_def.node:
    if '_class' in node.attr:
      del node.attr['_class']
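To illustrate what the helper does, here is a minimal, TensorFlow-free sketch: `_FakeNode` and `_FakeGraphDef` are hypothetical stand-ins for the protobuf `NodeDef`/`GraphDef` types, but the attribute map is deleted from in the same way on a real GraphDef.

```python
class _FakeNode:
    """Minimal stand-in for a NodeDef: a name plus an attr map."""
    def __init__(self, name, attr):
        self.name = name
        self.attr = attr

class _FakeGraphDef:
    """Minimal stand-in for a GraphDef: just a list of nodes."""
    def __init__(self, nodes):
        self.node = nodes

def _remove_colocation_attrs(graph_def):
    """Delete the '_class' (colocation) attribute from every node."""
    for node in graph_def.node:
        if '_class' in node.attr:
            del node.attr['_class']

# A tiny graph where one node is colocated with another (the '_class'
# attribute stores colocation groups as 'loc:@<node_name>' strings).
graph = _FakeGraphDef([
    _FakeNode('sub_1', {'T': 'DT_FLOAT'}),
    _FakeNode('Shape_1', {'T': 'DT_INT32', '_class': ['loc:@sub_1']}),
])
_remove_colocation_attrs(graph)
print(all('_class' not in n.attr for n in graph.node))  # → True
```

Dropping `_class` only removes placement hints, so the graph still imports cleanly for visualization even when the referenced node was optimized away.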

e.g., in the conversion script it can be called right after the GraphDef is created, before the graph is imported:

       graph_def = graph_pb2.GraphDef()

+      # WAR for issues loading optimized graphs.
+      _remove_colocation_attrs(graph_def)

       importer.import_graph_def(graph_def, name='')