Conversion from TensorFlow to UFF not working

Hi!

I’m trying to use a trained TensorFlow model. I froze the graph, which contains constructs like:

for layer_i, shape in enumerate(shapes):
    with tf.variable_scope("decoder/layer/{}".format(layer_i)):

        # per-layer deconvolution weights
        s = Ws[layer_i].get_shape().as_list()
        W = tf.get_variable(
            name='W',
            shape=[s[0], s[1], s[2], s[3]],
            initializer=tf.random_normal_initializer(mean=0.0, stddev=0.02))

        pad = paddings[layer_i]

        # transposed convolution producing this decoder layer's output
        h = tf.nn.conv2d_transpose(current_input, W,
            tf.stack([tf.shape(X)[0], shape[1], shape[2], shape[3]]),
            strides=[1, 1, pad, pad], padding='SAME', data_format='NCHW')

========================================================================
When I use convert-to-uff it fails with a KeyError: 'decoder/layer/6/W/read'
Listing the nodes confirms that this name is not part of the graph; it only contains 'decoder/layer/6/W'.
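
This is roughly how I listed the node names (the .pb file name here is just a placeholder for my frozen graph file):

import tensorflow as tf

graph_def = tf.GraphDef()
with tf.gfile.GFile("frozen_model.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

# print every node name in the frozen graph
for node in graph_def.node:
    print(node.name)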

Is this a bug? And what can I do to convert my model?
Any help is appreciated.

Regards,

Roos
P.S. I work on Windows 10, using TensorFlow 1.10 and TensorRT 5.

Hi!

I solved it. It turns out to be an undocumented requirement: you also have to call

tf.graph_util.remove_training_nodes

on the graph before you save the TensorFlow model.
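
For anyone hitting the same error, here is a minimal sketch of how that fits into the freeze step (the output node name and the file name are placeholders, not from my actual model):

import tensorflow as tf

# ... build the decoder graph as above, then inside the session:
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    # fold the variables into constants; "decoder/output" stands in
    # for the real output node name of your graph
    frozen = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, ["decoder/output"])

    # the undocumented part: strip training-only nodes before saving,
    # otherwise convert-to-uff raises the KeyError described above
    frozen = tf.graph_util.remove_training_nodes(frozen)

    with tf.gfile.GFile("frozen_model.pb", "wb") as f:
        f.write(frozen.SerializeToString())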

Bye!