Converting TensorFlow model with Fake Quant Nodes

I am trying to convert a trained Mobilenet V2 TensorFlow model to UFF using the convert-to-uff binary. I have trained two variants of this model:

  1. with fake quantization nodes using tf.contrib.quantize.create_training_graph()
  2. without fake quantization nodes.

With some special care (related to the batch_norm and fake quant nodes), both models can be successfully frozen.
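For reference, the "special care" amounts to rebuilding the graph in eval mode and inserting the eval-mode fake quant nodes before freezing. A rough sketch of that workflow follows; build_model, the checkpoint path, and the output node name are placeholders you would replace with your own:

```python
import tensorflow as tf

with tf.Graph().as_default() as graph:
    # Placeholder: your own model-building function, in inference mode.
    logits = build_model(is_training=False)
    # Rewrite the graph with eval-mode fake quant nodes; this also avoids the
    # EMA update ops that only belong in the training graph.
    tf.contrib.quantize.create_eval_graph()
    saver = tf.train.Saver()
    with tf.Session() as sess:
        saver.restore(sess, '/path/to/model.ckpt')  # placeholder path
        frozen = tf.graph_util.convert_variables_to_constants(
            sess, graph.as_graph_def(),
            ['MobilenetV2/Predictions/Reshape_1'])  # placeholder output node
        with tf.gfile.GFile('frozen_quant.pb', 'wb') as f:
            f.write(frozen.SerializeToString())
```

This requires TensorFlow 1.x (tf.contrib is gone in 2.x), and the eval graph must be built from the same model code used for training so variable names line up with the checkpoint.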

Once frozen, I am able to convert the graph without fake quant nodes to the .uff format. However, the conversion fails for the graph with fake quant nodes.

The error occurs for

node =  id: "MobilenetV2/Logits/Conv2d_1c_1x1/act_quant/AssignMaxEma/MobilenetV2/Logits/Conv2d_1c_1x1/act_quant/max/Pow"

with input

inputs: "MobilenetV2/Logits/Conv2d_1c_1x1/act_quant/AssignMaxEma/MobilenetV2/Logits/Conv2d_1c_1x1/act_quant/max/sub_1"

The traceback looks like this:

Traceback (most recent call last):
  File "/home/dave/miniconda2/envs/tf/bin/convert-to-uff", line 11, in <module>
    sys.exit(main())
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/bin/convert_to_uff.py", line 89, in main
    debug_mode=args.debug
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/converters/tensorflow/conversion_helpers.py", line 187, in from_tensorflow_frozen_model
    return from_tensorflow(graphdef, output_nodes, preprocessor, **kwargs)
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/converters/tensorflow/conversion_helpers.py", line 159, in from_tensorflow
    uff_metagraph_proto = uff_metagraph.to_uff()
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/model/meta_graph.py", line 39, in to_uff
    graphs=[graph.to_uff(debug) for graph in self.graphs],
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/model/meta_graph.py", line 39, in <listcomp>
    graphs=[graph.to_uff(debug) for graph in self.graphs],
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/model/graph.py", line 26, in to_uff
    graph = uff_pb.Graph(id=self.name, nodes=self._check_graph_and_get_nodes())
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/model/graph.py", line 46, in _check_graph_and_get_nodes
    raise extend_with_original_traceback(e, node._trace)
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/model/graph.py", line 44, in _check_graph_and_get_nodes
    nodes.append(self._check_and_get_node(node))
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/model/graph.py", line 37, in _check_and_get_node
    self.meta_graph.descriptor.check_node(node, self.meta_graph.referenced_data)
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/model/descriptor.py", line 298, in check_node
    return self._desc_ops[node.operation]._check_node(node, fields, extra_fields)
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/model/descriptor.py", line 214, in _check_node
    err = constraint(node, fields, extra_fields, shared_mem)
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/model/descriptor.py", line 31, in __call__
    return self.func(*args, **kwargs)
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/model/descriptor.py", line 93, in _constraint
    raise error
uff.model.exceptions.UffException: func had bad type or is not present

Originally defined at:
  File "/home/dave/miniconda2/envs/tf/bin/convert-to-uff", line 11, in <module>
    sys.exit(main())
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/bin/convert_to_uff.py", line 89, in main
    debug_mode=args.debug
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/converters/tensorflow/conversion_helpers.py", line 187, in from_tensorflow_frozen_model
    return from_tensorflow(graphdef, output_nodes, preprocessor, **kwargs)
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/converters/tensorflow/conversion_helpers.py", line 157, in from_tensorflow
    debug_mode=debug_mode)
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/converters/tensorflow/converter.py", line 94, in convert_tf2uff_graph
    uff_graph, input_replacements, debug_mode=debug_mode)
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/converters/tensorflow/converter.py", line 79, in convert_tf2uff_node
    op, name, tf_node, inputs, uff_graph, tf_nodes=tf_nodes, debug_mode=debug_mode)
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/converters/tensorflow/converter.py", line 47, in convert_layer
    return cls.registry_[op](name, tf_node, inputs, uff_graph, **kwargs)
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/converters/tensorflow/converter_functions.py", line 107, in convert_pow
    uff_graph.unary(inputs[0], 'pow', name)
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/model/graph.py", line 174, in unary
    fields=fields, extra_fields=extra_fields)
  File "/home/dave/miniconda2/envs/tf/lib/python3.5/site-packages/uff/model/graph.py", line 65, in _add_node
    node = Node(self, op, name, inputs, fields, extra_fields)

Hi, could you tell me how you successfully froze the model with fake quantization nodes?
I am using the slim MobilenetV2 model. After training with tf.contrib.quantize.create_training_graph(), freezing the model fails with an error like

Attempting to use uninitialized value MobilenetV2/expanded_conv_6/depthwise/weights_quant/max, MobilenetV2/expanded_conv_14/post_activation_bypass_quant/min

so I want to know how you did it. Thank you!

Please try initializing variables before your restore operation rather than after it: a blanket initializer run after the restore would overwrite the restored weights, whereas running it first initializes the quant min/max variables the checkpoint lacks and then lets the restore fill in the trained weights. Note also that tf.initialize_all_variables() is deprecated in favor of tf.global_variables_initializer().

sess = tf.Session()
# Initialize everything, including the quant min/max variables that are
# missing from the checkpoint, then restore the trained weights on top.
sess.run(tf.global_variables_initializer())
saver.restore(sess, checkpoint_path)  # your existing restore operation