They are inside my model. I have tried other nodes as well, such as detection_scores, detection_boxes, and detection_classes, and none of them works.
Do you have a working sample that converts a TensorFlow .pb file to a .uff file?
jaybdub:
Hi gustavvz,
It seems like the output node names are not in the TensorFlow graph. It may help to use the tensorboard visualization tool to visualize the TensorFlow graph and determine the output name. For example, by running
import keras
import keras.backend as K
import tensorflow as tf
vgg = keras.applications.vgg19.VGG19()
sess = K.get_session()
tf.summary.FileWriter('tensorboard_logdir', sess.graph_def)
You may then visualize the graph by launching $ tensorboard --logdir=tensorboard_logdir. For me, the output node name was 'predictions/Softmax'. Using this name I was able to freeze the graph and convert to uff as follows.
import keras
import keras.backend as K
import tensorflow as tf
import uff
output_names = ['predictions/Softmax']
frozen_graph_filename = 'keras_vgg19_frozen_graph.pb'
sess = K.get_session()
# freeze graph and remove training nodes
graph_def = tf.graph_util.convert_variables_to_constants(sess, sess.graph_def, output_names)
graph_def = tf.graph_util.remove_training_nodes(graph_def)
# write frozen graph to file
with open(frozen_graph_filename, 'wb') as f:
    f.write(graph_def.SerializeToString())
# convert frozen graph to uff
uff_model = uff.from_tensorflow_frozen_model(frozen_graph_filename, output_names)
(Keras 2.1.2, TensorFlow 1.4.1, uff 0.2.0)
Hope this helps.
John
@jaybdub
How do I give two output names to the function 'uff.from_tensorflow_frozen_model'?
Thank you
Hi, uvindusanda
Try this:
output_names = ['A','B','C',....]
uff_model = uff.from_tensorflow_frozen_model(frozen_graph_filename, output_names)
Thanks.
Hi,
I am trying to convert a TensorFlow .pb file to a .uff file using uff, and it fails.
This is on TensorRT 4.1 with CUDA 8.0.
import tensorflow as tf
import uff
if __name__ == '__main__':
    uff.from_tensorflow(graphdef='/home/Work/Tensorrt/Model/ssd_inception_v2_coco_2018_01_28/frozen_inference_graph.pb',
                        output_filename='ssd.uff',
                        output_nodes=['detection_scores', 'detection_boxes', 'detection_classes', 'num_detections'])
output:
import pandas.parser as _parser
Using output node detection_scores
Using output node detection_boxes
Using output node detection_classes
Using output node num_detections
Converting to UFF graph
Traceback (most recent call last):
  File "convert.py", line 7, in <module>
    output_nodes=['detection_scores', 'detection_boxes', 'detection_classes', 'num_detections'])
  File "/usr/lib/python2.7/dist-packages/uff/converters/tensorflow/conversion_helpers.py", line 120, in from_tensorflow
    name="main")
  File "/usr/lib/python2.7/dist-packages/uff/converters/tensorflow/converter.py", line 77, in convert_tf2uff_graph
    uff_graph, input_replacements)
  File "/usr/lib/python2.7/dist-packages/uff/converters/tensorflow/converter.py", line 54, in convert_tf2uff_node
    raise UffException(str(name) + " was not found in the graph. Please use the -l option to list nodes in the graph.")
uff.model.exceptions.UffException: num_detections was not found in the graph. Please use the -l option to list nodes in the graph.
I am pretty sure those output nodes are right. Please help.
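In case it is useful to others hitting this error: the exception itself suggests listing the nodes in the graph. A minimal Python sketch that does the same thing with the TF 1.x API (the .pb path is the one from the script above):
import tensorflow as tf
# Load the frozen graph and print every node name and op type, so the
# real output node names can be read off instead of guessed.
graph_def = tf.GraphDef()
with open('/home/Work/Tensorrt/Model/ssd_inception_v2_coco_2018_01_28/frozen_inference_graph.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())
for node in graph_def.node:
    print(node.name, node.op)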
I was facing the same issue. I realized that the nodes that were not recognized by uff are Identity ops. Passing the names of the nodes that produce those Identity nodes as the output_nodes argument solved the issue.
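For reference, one way to find the ops that feed those Identity nodes is to walk the frozen GraphDef; a minimal sketch, assuming the TF 1.x API and the SSD output names from the earlier post:
import tensorflow as tf
# Find the ops that feed the Identity output nodes, so their names can be
# passed to the UFF converter instead of the Identity names.
graph_def = tf.GraphDef()
with open('frozen_inference_graph.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())
identity_outputs = ['detection_scores', 'detection_boxes',
                    'detection_classes', 'num_detections']
for node in graph_def.node:
    if node.op == 'Identity' and node.name in identity_outputs:
        # node.input holds the producer names; strip any ':0'-style suffix
        print(node.name, '<-', [inp.split(':')[0] for inp in node.input])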
ysyyork
November 5, 2020, 7:24am
Is there any tutorial for the TensorFlow 2 SavedModel format? I searched around and only saw topics about frozen graphs.
Did you find anything for the TensorFlow 2 SavedModel format?