Loading ./models/tensorflow_inception_graph.pb
Using output node softmax
Converting to UFF graph
Warning: No conversion function registered for layer: Concat yet.
Converting as custom op Concat mixed_10/join
name: "mixed_10/join" op: "Concat" input: "mixed_10/join/concat_dim" input: "mixed_10/conv" input: "mixed_10/tower/mixed/conv" input: "mixed_10/tower/mixed/conv_1" input: "mixed_10/tower_1/mixed/conv" input: "mixed_10/tower_1/mixed/conv_1" input: "mixed_10/tower_2/conv" attr { key: "N" value { i: 6 } } attr { key: "T" value { type: DT_FLOAT } }
Warning: No conversion function registered for layer: BatchNormWithGlobalNormalization yet.
Converting as custom op BatchNormWithGlobalNormalization mixed_10/tower_2/conv/batchnorm
name: "mixed_10/tower_2/conv/batchnorm" op: "BatchNormWithGlobalNormalization" input: "mixed_10/tower_2/conv/Conv2D" input: "mixed_10/tower_2/conv/batchnorm/moving_mean" input: "mixed_10/tower_2/conv/batchnorm/moving_variance" input: "mixed_10/tower_2/conv/batchnorm/beta" input: "mixed_10/tower_2/conv/batchnorm/gamma" attr { key: "T" value { type: DT_FLOAT } } attr { key: "scale_after_normalization" value { b: false } } attr { key: "variance_epsilon" value { f: 0.0010000000475 } }
Warning: No conversion function registered for layer: Concat yet.
Converting as custom op Concat mixed_9/join
name: "mixed_9/join" op: "Concat" input: "mixed_9/join/concat_dim" input: "mixed_9/conv" input: "mixed_9/tower/mixed/conv" input: "mixed_9/tower/mixed/conv_1" input: "mixed_9/tower_1/mixed/conv" input: "mixed_9/tower_1/mixed/conv_1" input: "mixed_9/tower_2/conv" attr { key: "N" value { i: 6 } } attr { key: "T" value { type: DT_FLOAT } }
Warning: No conversion function registered for layer: BatchNormWithGlobalNormalization yet.
Converting as custom op BatchNormWithGlobalNormalization mixed_9/tower_2/conv/batchnorm
name: "mixed_9/tower_2/conv/batchnorm" op: "BatchNormWithGlobalNormalization" input: "mixed_9/tower_2/conv/Conv2D" input: "mixed_9/tower_2/conv/batchnorm/moving_mean" input: "mixed_9/tower_2/conv/batchnorm/moving_variance" input: "mixed_9/tower_2/conv/batchnorm/beta" input: "mixed_9/tower_2/conv/batchnorm/gamma" attr { key: "T" value { type: DT_FLOAT } } attr { key: "scale_after_normalization" value { b: false } } attr { key: "variance_epsilon" value { f: 0.0010000000475 } }
Warning: No conversion function registered for layer: Concat yet.
Converting as custom op Concat mixed_8/join
name: "mixed_8/join" op: "Concat" input: "mixed_8/join/concat_dim" input: "mixed_8/tower/conv_1" input: "mixed_8/tower_1/conv_3" input: "mixed_8/pool" attr { key: "N" value { i: 3 } } attr { key: "T" value { type: DT_FLOAT } }
Warning: No conversion function registered for layer: Concat yet.
Converting as custom op Concat mixed_7/join
name: "mixed_7/join" op: "Concat" input: "mixed_7/join/concat_dim" input: "mixed_7/conv" input: "mixed_7/tower/conv_2" input: "mixed_7/tower_1/conv_4" input: "mixed_7/tower_2/conv" attr { key: "N" value { i: 4 } } attr { key: "T" value { type: DT_FLOAT } }
Warning: No conversion function registered for layer: BatchNormWithGlobalNormalization yet.
Converting as custom op BatchNormWithGlobalNormalization mixed_7/tower_2/conv/batchnorm
name: "mixed_7/tower_2/conv/batchnorm" op: "BatchNormWithGlobalNormalization" input: "mixed_7/tower_2/conv/Conv2D" input: "mixed_7/tower_2/conv/batchnorm/moving_mean" input: "mixed_7/tower_2/conv/batchnorm/moving_variance" input: "mixed_7/tower_2/conv/batchnorm/beta" input: "mixed_7/tower_2/conv/batchnorm/gamma" attr { key: "T" value { type: DT_FLOAT } } attr { key: "scale_after_normalization" value { b: false } } attr { key: "variance_epsilon" value { f: 0.0010000000475 } }
Warning: No conversion function registered for layer: Concat yet.
Converting as custom op Concat mixed_6/join
name: "mixed_6/join" op: "Concat" input: "mixed_6/join/concat_dim" input: "mixed_6/conv" input: "mixed_6/tower/conv_2" input: "mixed_6/tower_1/conv_4" input: "mixed_6/tower_2/conv" attr { key: "N" value { i: 4 } } attr { key: "T" value { type: DT_FLOAT } }
Warning: No conversion function registered for layer: BatchNormWithGlobalNormalization yet.
Converting as custom op BatchNormWithGlobalNormalization mixed_6/tower_2/conv/batchnorm
name: "mixed_6/tower_2/conv/batchnorm" op: "BatchNormWithGlobalNormalization" input: "mixed_6/tower_2/conv/Conv2D" input: "mixed_6/tower_2/conv/batchnorm/moving_mean" input: "mixed_6/tower_2/conv/batchnorm/moving_variance" input: "mixed_6/tower_2/conv/batchnorm/beta" input: "mixed_6/tower_2/conv/batchnorm/gamma" attr { key: "T" value { type: DT_FLOAT } } attr { key: "scale_after_normalization" value { b: false } } attr { key: "variance_epsilon" value { f: 0.0010000000475 } }
Warning: No conversion function registered for layer: Concat yet.
Converting as custom op Concat mixed_5/join
name: "mixed_5/join" op: "Concat" input: "mixed_5/join/concat_dim" input: "mixed_5/conv" input: "mixed_5/tower/conv_2" input: "mixed_5/tower_1/conv_4" input: "mixed_5/tower_2/conv" attr { key: "N" value { i: 4 } } attr { key: "T" value { type: DT_FLOAT } }
Warning: No conversion function registered for layer: BatchNormWithGlobalNormalization yet.
Converting as custom op BatchNormWithGlobalNormalization mixed_5/tower_2/conv/batchnorm
name: "mixed_5/tower_2/conv/batchnorm" op: "BatchNormWithGlobalNormalization" input: "mixed_5/tower_2/conv/Conv2D" input: "mixed_5/tower_2/conv/batchnorm/moving_mean" input: "mixed_5/tower_2/conv/batchnorm/moving_variance" input: "mixed_5/tower_2/conv/batchnorm/beta" input: "mixed_5/tower_2/conv/batchnorm/gamma" attr { key: "T" value { type: DT_FLOAT } } attr { key: "scale_after_normalization" value { b: false } } attr { key: "variance_epsilon" value { f: 0.0010000000475 } }
Warning: No conversion function registered for layer: Concat yet.
Converting as custom op Concat mixed_4/join
name: "mixed_4/join" op: "Concat" input: "mixed_4/join/concat_dim" input: "mixed_4/conv" input: "mixed_4/tower/conv_2" input: "mixed_4/tower_1/conv_4" input: "mixed_4/tower_2/conv" attr { key: "N" value { i: 4 } } attr { key: "T" value { type: DT_FLOAT } }
Warning: No conversion function registered for layer: BatchNormWithGlobalNormalization yet.
Converting as custom op BatchNormWithGlobalNormalization mixed_4/tower_2/conv/batchnorm
name: "mixed_4/tower_2/conv/batchnorm" op: "BatchNormWithGlobalNormalization" input: "mixed_4/tower_2/conv/Conv2D" input: "mixed_4/tower_2/conv/batchnorm/moving_mean" input: "mixed_4/tower_2/conv/batchnorm/moving_variance" input: "mixed_4/tower_2/conv/batchnorm/beta" input: "mixed_4/tower_2/conv/batchnorm/gamma" attr { key: "T" value { type: DT_FLOAT } } attr { key: "scale_after_normalization" value { b: false } } attr { key: "variance_epsilon" value { f: 0.0010000000475 } }
Warning: No conversion function registered for layer: Concat yet.
Converting as custom op Concat mixed_3/join
name: "mixed_3/join" op: "Concat" input: "mixed_3/join/concat_dim" input: "mixed_3/conv" input: "mixed_3/tower/conv_2" input: "mixed_3/pool" attr { key: "N" value { i: 3 } } attr { key: "T" value { type: DT_FLOAT } }
Warning: No conversion function registered for layer: Concat yet.
Converting as custom op Concat mixed_2/join
name: "mixed_2/join" op: "Concat" input: "mixed_2/join/concat_dim" input: "mixed_2/conv" input: "mixed_2/tower/conv_1" input: "mixed_2/tower_1/conv_2" input: "mixed_2/tower_2/conv" attr { key: "N" value { i: 4 } } attr { key: "T" value { type: DT_FLOAT } }
Warning: No conversion function registered for layer: BatchNormWithGlobalNormalization yet.
Converting as custom op BatchNormWithGlobalNormalization mixed_2/tower_2/conv/batchnorm
name: "mixed_2/tower_2/conv/batchnorm" op: "BatchNormWithGlobalNormalization" input: "mixed_2/tower_2/conv/Conv2D" input: "mixed_2/tower_2/conv/batchnorm/moving_mean" input: "mixed_2/tower_2/conv/batchnorm/moving_variance" input: "mixed_2/tower_2/conv/batchnorm/beta" input: "mixed_2/tower_2/conv/batchnorm/gamma" attr { key: "T" value { type: DT_FLOAT } } attr { key: "scale_after_normalization" value { b: false } } attr { key: "variance_epsilon" value { f: 0.0010000000475 } }
Warning: No conversion function registered for layer: Concat yet.
Converting as custom op Concat mixed_1/join
name: "mixed_1/join" op: "Concat" input: "mixed_1/join/concat_dim" input: "mixed_1/conv" input: "mixed_1/tower/conv_1" input: "mixed_1/tower_1/conv_2" input: "mixed_1/tower_2/conv" attr { key: "N" value { i: 4 } } attr { key: "T" value { type: DT_FLOAT } }
Warning: No conversion function registered for layer: BatchNormWithGlobalNormalization yet.
Converting as custom op BatchNormWithGlobalNormalization mixed_1/tower_2/conv/batchnorm
name: "mixed_1/tower_2/conv/batchnorm" op: "BatchNormWithGlobalNormalization" input: "mixed_1/tower_2/conv/Conv2D" input: "mixed_1/tower_2/conv/batchnorm/moving_mean" input: "mixed_1/tower_2/conv/batchnorm/moving_variance" input: "mixed_1/tower_2/conv/batchnorm/beta" input: "mixed_1/tower_2/conv/batchnorm/gamma" attr { key: "T" value { type: DT_FLOAT } } attr { key: "scale_after_normalization" value { b: false } } attr { key: "variance_epsilon" value { f: 0.0010000000475 } }
Warning: No conversion function registered for layer: Concat yet.
Converting as custom op Concat mixed/join
name: "mixed/join" op: "Concat" input: "mixed/join/concat_dim" input: "mixed/conv" input: "mixed/tower/conv_1" input: "mixed/tower_1/conv_2" input: "mixed/tower_2/conv" attr { key: "N" value { i: 4 } } attr { key: "T" value { type: DT_FLOAT } }
Warning: No conversion function registered for layer: BatchNormWithGlobalNormalization yet.
Converting as custom op BatchNormWithGlobalNormalization mixed/tower_2/conv/batchnorm
name: "mixed/tower_2/conv/batchnorm" op: "BatchNormWithGlobalNormalization" input: "mixed/tower_2/conv/Conv2D" input: "mixed/tower_2/conv/batchnorm/moving_mean" input: "mixed/tower_2/conv/batchnorm/moving_variance" input: "mixed/tower_2/conv/batchnorm/beta" input: "mixed/tower_2/conv/batchnorm/gamma" attr { key: "T" value { type: DT_FLOAT } } attr { key: "scale_after_normalization" value { b: false } } attr { key: "variance_epsilon" value { f: 0.0010000000475 } }
Warning: No conversion function registered for layer: BatchNormWithGlobalNormalization yet.
Converting as custom op BatchNormWithGlobalNormalization conv_4/batchnorm
name: "conv_4/batchnorm" op: "BatchNormWithGlobalNormalization" input: "conv_4/Conv2D" input: "conv_4/batchnorm/moving_mean" input: "conv_4/batchnorm/moving_variance" input: "conv_4/batchnorm/beta" input: "conv_4/batchnorm/gamma" attr { key: "T" value { type: DT_FLOAT } } attr { key: "scale_after_normalization" value { b: false } } attr { key: "variance_epsilon" value { f: 0.0010000000475 } }
Warning: No conversion function registered for layer: BatchNormWithGlobalNormalization yet.
Converting as custom op BatchNormWithGlobalNormalization conv_3/batchnorm
name: "conv_3/batchnorm" op: "BatchNormWithGlobalNormalization" input: "conv_3/Conv2D" input: "conv_3/batchnorm/moving_mean" input: "conv_3/batchnorm/moving_variance" input: "conv_3/batchnorm/beta" input: "conv_3/batchnorm/gamma" attr { key: "T" value { type: DT_FLOAT } } attr { key: "scale_after_normalization" value { b: false } } attr { key: "variance_epsilon" value { f: 0.0010000000475 } }
Warning: No conversion function registered for layer: BatchNormWithGlobalNormalization yet.
Converting as custom op BatchNormWithGlobalNormalization conv_2/batchnorm
name: "conv_2/batchnorm" op: "BatchNormWithGlobalNormalization" input: "conv_2/Conv2D" input: "conv_2/batchnorm/moving_mean" input: "conv_2/batchnorm/moving_variance" input: "conv_2/batchnorm/beta" input: "conv_2/batchnorm/gamma" attr { key: "T" value { type: DT_FLOAT } } attr { key: "scale_after_normalization" value { b: false } } attr { key: "variance_epsilon" value { f: 0.0010000000475 } }
Warning: No conversion function registered for layer: BatchNormWithGlobalNormalization yet.
Converting as custom op BatchNormWithGlobalNormalization conv_1/batchnorm
name: "conv_1/batchnorm" op: "BatchNormWithGlobalNormalization" input: "conv_1/Conv2D" input: "conv_1/batchnorm/moving_mean" input: "conv_1/batchnorm/moving_variance" input: "conv_1/batchnorm/beta" input: "conv_1/batchnorm/gamma" attr { key: "T" value { type: DT_FLOAT } } attr { key: "scale_after_normalization" value { b: false } } attr { key: "variance_epsilon" value { f: 0.0010000000475 } }
Warning: No conversion function registered for layer: BatchNormWithGlobalNormalization yet.
Converting as custom op BatchNormWithGlobalNormalization conv/batchnorm
name: "conv/batchnorm" op: "BatchNormWithGlobalNormalization" input: "conv/Conv2D" input: "conv/batchnorm/moving_mean" input: "conv/batchnorm/moving_variance" input: "conv/batchnorm/beta" input: "conv/batchnorm/gamma" attr { key: "T" value { type: DT_FLOAT } } attr { key: "scale_after_normalization" value { b: false } } attr { key: "variance_epsilon" value { f: 0.0010000000475 } }
Warning: No conversion function registered for layer: ResizeBilinear yet.
Converting as custom op ResizeBilinear ResizeBilinear
name: "ResizeBilinear" op: "ResizeBilinear" input: "ExpandDims" input: "ResizeBilinear/size" attr { key: "T" value { type: DT_FLOAT } }
Warning: No conversion function registered for layer: ExpandDims yet.
Converting as custom op ExpandDims ExpandDims
name: "ExpandDims" op: "ExpandDims" input: "Cast" input: "ExpandDims/dim" attr { key: "T" value { type: DT_FLOAT } }
Warning: No conversion function registered for layer: Cast yet.
Converting as custom op Cast Cast
name: "Cast" op: "Cast" input: "DecodeJpeg" attr { key: "DstT" value { type: DT_FLOAT } } attr { key: "SrcT" value { type: DT_UINT8 } }
Warning: No conversion function registered for layer: DecodeJpeg yet.
Converting as custom op DecodeJpeg DecodeJpeg
name: "DecodeJpeg" op: "DecodeJpeg" input: "DecodeJpeg/contents" attr { key: "acceptable_fraction" value { f: 1.0 } } attr { key: "channels" value { i: 3 } } attr { key: "fancy_upscaling" value { b: true } } attr { key: "ratio" value { i: 1 } } attr { key: "try_recover_truncated" value { b: false } }
Traceback (most recent call last):
  File "/usr/local/bin/convert-to-uff", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python2.7/dist-packages/uff/bin/convert_to_uff.py", line 104, in main
    output_filename=args.output
  File "/usr/local/lib/python2.7/dist-packages/uff/converters/tensorflow/conversion_helpers.py", line 103, in from_tensorflow_frozen_model
    return from_tensorflow(graphdef, output_nodes, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/uff/converters/tensorflow/conversion_helpers.py", line 75, in from_tensorflow
    name="main")
  File "/usr/local/lib/python2.7/dist-packages/uff/converters/tensorflow/converter.py", line 64, in convert_tf2uff_graph
    uff_graph, input_replacements)
  File "/usr/local/lib/python2.7/dist-packages/uff/converters/tensorflow/converter.py", line 51, in convert_tf2uff_node
    op, name, tf_node, inputs, uff_graph, tf_nodes=tf_nodes)
  File "/usr/local/lib/python2.7/dist-packages/uff/converters/tensorflow/converter.py", line 32, in convert_layer
    return cls.registry_[op](name, tf_node, inputs, uff_graph, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/uff/converters/tensorflow/converter_functions.py", line 26, in convert_const
    array = tf2uff.convert_tf2numpy_const_node(tf_node)
  File "/usr/local/lib/python2.7/dist-packages/uff/converters/tensorflow/converter.py", line 99, in convert_tf2numpy_const_node
    array = np.fromstring(data, dtype=np_dtype)
ValueError: zero-valued itemsize
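The final ValueError is raised where the converter parses a Const node's raw bytes with np.fromstring(data, dtype=np_dtype): NumPy rejects any dtype whose itemsize is 0, because it cannot tell how many bytes form one element. A variable-length string dtype behaves exactly this way, which would fit a DT_STRING constant such as "DecodeJpeg/contents" still being in the graph (that attribution is an inference from the log, not stated in it). A minimal sketch of just the NumPy failure mode:

```python
import numpy as np

# A variable-length bytes dtype ("S" with no length) has itemsize 0,
# so NumPy cannot split a raw buffer into elements of that type.
np_dtype = np.dtype("S")
print(np_dtype.itemsize)  # 0

try:
    # np.fromstring (the call in the converter) is deprecated;
    # np.frombuffer rejects a zero-itemsize dtype the same way.
    np.frombuffer(b"\x00\x01\x02", dtype=np_dtype)
except ValueError as exc:
    print("ValueError:", exc)
```

The fix is therefore not in the converter call itself but in removing (or replacing) the string-typed nodes before conversion.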
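The DecodeJpeg, Cast, ExpandDims, and ResizeBilinear warnings show that the JPEG preprocessing pipeline was frozen into the graph along with the network. A common remedy (an assumption here, not something the log confirms for this model) is to cut the graph at the real network input before running convert-to-uff, e.g. with TensorFlow's graph_util.extract_sub_graph, so the string-typed DecodeJpeg subgraph never reaches the converter. The pruning itself is plain reachability; a self-contained sketch over a toy graph with hypothetical, abbreviated node names:

```python
from collections import deque

def prune_reachable(nodes, outputs, stop_at):
    """Keep only nodes reachable from `outputs`; names in `stop_at`
    are treated as graph inputs, so their upstream nodes are dropped.
    `nodes` maps each node name to the list of its input names."""
    keep, queue = set(), deque(outputs)
    while queue:
        name = queue.popleft()
        if name in keep or name not in nodes:
            continue
        keep.add(name)
        if name in stop_at:
            continue  # cut here: ignore everything upstream
        queue.extend(nodes[name])
    return keep

# Toy graph mirroring the log's preprocessing chain (names hypothetical):
graph = {
    "DecodeJpeg/contents": [],
    "DecodeJpeg": ["DecodeJpeg/contents"],
    "Cast": ["DecodeJpeg"],
    "ExpandDims": ["Cast"],
    "ResizeBilinear": ["ExpandDims"],
    "conv": ["ResizeBilinear"],
    "softmax": ["conv"],
}

kept = prune_reachable(graph, outputs=["softmax"], stop_at={"ResizeBilinear"})
print(sorted(kept))  # ['ResizeBilinear', 'conv', 'softmax']
```

With the cut made at "ResizeBilinear", the DecodeJpeg/Cast/ExpandDims nodes (and the DT_STRING constant that triggers the zero-valued-itemsize error) are excluded, and image decoding must then be done outside the engine.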