I would like to convert the frozen .pb file from ssd_mobilenet_v1_quantized_300x300_coco14_sync to a UFF file, so I tried convert_to_uff.py, but the conversion fails.
$python3 /usr/lib/python3.6/dist-packages/uff/bin/convert_to_uff.py tflite_graph.pb -o ssd_mobilenet_v1.uff -O NMS -p config.py
My config.py is attached: config.py.txt (2.0 KB)
The error message is as follows:
2020-11-22 17:40:44.038248: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.2
WARNING:tensorflow:Deprecation warnings have been disabled. Set TF_ENABLE_DEPRECATION_WARNINGS=1 to re-enable them.
Loading tflite_graph.pb
WARNING:tensorflow:From /usr/lib/python3.6/dist-packages/uff/bin/../../uff/converters/tensorflow/conversion_helpers.py:274: The name tf.gfile.GFile is deprecated. Please use tf.io.gfile.GFile instead.
NOTE: UFF has been tested with TensorFlow 1.15.0.
WARNING: The version of TensorFlow installed on this system is not guaranteed to work with UFF.
Traceback (most recent call last):
  File "/usr/lib/python3.6/dist-packages/uff/bin/convert_to_uff.py", line 143, in <module>
    main()
  File "/usr/lib/python3.6/dist-packages/uff/bin/convert_to_uff.py", line 139, in main
    debug_mode=args.debug
  File "/usr/lib/python3.6/dist-packages/uff/bin/../../uff/converters/tensorflow/conversion_helpers.py", line 276, in from_tensorflow_frozen_model
    return from_tensorflow(graphdef, output_nodes, preprocessor, **kwargs)
  File "/usr/lib/python3.6/dist-packages/uff/bin/../../uff/converters/tensorflow/conversion_helpers.py", line 153, in from_tensorflow
    pre.preprocess(dynamic_graph)
  File "/home/totti/09.DeepStreamTotti/ssd_mobilenet_v1_quantized_300x300_coco14_sync_2018_07_18/config.py", line 56, in preprocess
    dynamic_graph.find_nodes_by_op("NMS_TRT")[0].input.remove("Input")
  File "/usr/local/lib/python3.6/dist-packages/google/protobuf/internal/containers.py", line 296, in remove
    self._values.remove(elem)
ValueError: list.remove(x): x not in list
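As I understand the traceback, the node's `input` field is a protobuf repeated container that behaves like a Python list, so `remove("Input")` raises `ValueError` when the NMS node has no input literally named `"Input"`. A guarded version of that removal (just a sketch to illustrate the failure mode, not my actual config.py; a plain list stands in for the protobuf field) would be:

```python
def remove_input_if_present(node_inputs, name):
    """Remove `name` from a node's input list only if it is present,
    avoiding ValueError: list.remove(x): x not in list."""
    if name in node_inputs:
        node_inputs.remove(name)
    else:
        # Show the actual input names so the config can be corrected.
        print("'%s' not found; actual inputs: %s" % (name, list(node_inputs)))
    return node_inputs

# Example: only the matching entry is removed; a missing name is reported.
remove_input_if_present(["Input", "concat_box_loc"], "Input")
```

So I suspect the input names in my graph differ from what config.py expects, and I should check the real inputs of the NMS node.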
How can I convert this .pb file to a UFF file? Thank you for your support.