TensorRT 3.0 RC convert_to_uff tool doesn't support -I option

1. The convert_to_uff tool can't be launched without specifying the full path to the convert_to_uff.py file.
2. Even when I run

python /usr/local/lib/python2.7/dist-packages/uff/bin/convert_to_uff.py tensorflow -o ./hello_world.uff -t --input-file ./hello_world_frozen.pb -O stream0/classifier/logits/BiasAdd -I input_images_ph

I got

Loading ./hello_world_frozen.pb
Using output node stream0/classifier/logits/BiasAdd
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/uff/bin/convert_to_uff.py", line 109, in <module>
  File "/usr/local/lib/python2.7/dist-packages/uff/bin/convert_to_uff.py", line 104, in main
  File "/usr/local/lib/python2.7/dist-packages/uff/converters/tensorflow/conversion_helpers.py", line 103, in from_tensorflow_frozen_model
    return from_tensorflow(graphdef, output_nodes, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/uff/converters/tensorflow/conversion_helpers.py", line 55, in from_tensorflow
    name, new_name, dtype, shape = name_data.split(',', 3)
ValueError: need more than 1 value to unpack
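The traceback shows where this goes wrong: the `split(',', 3)` call at conversion_helpers.py line 55 expects the `-I` value to contain at least four comma-separated fields, so a bare node name can't be unpacked. A minimal sketch of that parsing step (simplified and hypothetical; the real function does more) reproduces the error:

```python
# Sketch of the parsing step from the traceback above (simplified).
def parse_input_spec(name_data):
    # split(',', 3) yields at most 4 fields; the 4-way unpack fails
    # unless at least 4 comma-separated values are supplied.
    name, new_name, dtype, shape = name_data.split(',', 3)
    return name, new_name, dtype, shape

# "-I input_images_ph" passes only the bare node name, so unpacking fails:
try:
    parse_input_spec("input_images_ph")
except ValueError as err:
    print("ValueError:", err)

# A fully comma-separated value unpacks cleanly (field values here are
# hypothetical placeholders, not taken from the original model):
print(parse_input_spec("input_images_ph,input_images_ph,float32,1,224,224,3"))
```

Note the trailing shape field keeps any remaining commas, since the split is capped at three.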

3. Solution: I just removed the -I option, and then the conversion proceeded further.


Did you manage to use your converted model, and if so, what did you use as your input node? Thanks!

Never mind, I figured it out. You have to pass -I name_of_node,data_type,dim1,dim2,…
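Assuming the format described in that reply, a small helper (hypothetical, not part of the UFF package; the dims shown are placeholders) can build the `-I` value so you don't hand-type the commas:

```python
def make_input_spec(name, dtype, dims):
    """Build the comma-separated value for convert_to_uff's -I flag,
    following the name_of_node,data_type,dim1,dim2,... format above."""
    return ",".join([name, dtype] + [str(d) for d in dims])

# Example with placeholder dimensions -- substitute your model's real shape:
spec = make_input_spec("input_images_ph", "float32", [1, 224, 224, 3])
print(spec)  # input_images_ph,float32,1,224,224,3
```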

Now I see that the convert_to_uff tool needs the input node specified in the -I name_of_node,data_type,dim1,dim2,… format. What about multiple input and output nodes?