Hi there,
I ran into a problem while trying to convert an Inception-V3 model to UFF. Here is the relevant part of my code:
# Convert a TensorFlow model to UFF
uff_model = uff.from_tensorflow_frozen_model("./inception_v3.pb", ["InceptionV3/Logits/SpatialSqueeze"])
#Import a UFF Model into TensorRT and Create an Engine
G_LOGGER = trt.infer.ConsoleLogger(trt.infer.LogSeverity.ERROR)
parser = uffparser.create_uff_parser()
parser.register_input("input", (1,224,224), 0)
parser.register_output("InceptionV3/Logits/SpatialSqueeze")
engine = trt.utils.uff_to_trt_engine(G_LOGGER, uff_model, parser, 128, 1 << 10)
And this is the error output:
Using output node InceptionV3/Logits/SpatialSqueeze
Converting to UFF graph
No. nodes: 788
[TensorRT] ERROR: InceptionV3/InceptionV3/Conv2d_1a_3x3/Conv2D: kernel weights has count 864 but 288 was expected
[TensorRT] ERROR: UFFParser: Parser error: InceptionV3/InceptionV3/Conv2d_1a_3x3/BatchNorm/FusedBatchNorm: Invalid scale mode, nbWeights: 32
[TensorRT] ERROR: Failed to parse UFF model stream
Traceback (most recent call last):
  File "tensorrt_gen.py", line 28, in <module>
    engine = trt.utils.uff_to_trt_engine(G_LOGGER, uff_model, parser, 128, 1 << 10)
  File "/usr/lib64/python2.7/site-packages/tensorrt/utils/_utils.py", line 199, in uff_to_trt_engine
    raise AssertionError('UFF parsing failed on line {} in statement {}'.format(line, text))
AssertionError: UFF parsing failed on line 191 in statement assert(parser.parse(stream, network, model_datatype))
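One thing I noticed while double-checking the numbers in the error: Inception-V3's first convolution (Conv2d_1a_3x3) uses 32 filters of size 3x3, so with the model's native 3-channel input the kernel holds 3*3*3*32 = 864 weights, while a 1-channel input, as registered with register_input("input", (1,224,224), 0) above, would imply 3*3*1*32 = 288. Those are exactly the two counts in the parser error, so I suspect the registered input shape is wrong (Inception-V3's native input is 3x299x299). A quick sanity check of that arithmetic:

```python
# Kernel weight counts for InceptionV3/Conv2d_1a_3x3 (32 filters of 3x3).
kh, kw, num_filters = 3, 3, 32

found = kh * kw * 3 * num_filters     # 3-channel input, as stored in the .pb
expected = kh * kw * 1 * num_filters  # 1-channel input, as I registered it

print(found, expected)  # 864 288 -- the same numbers the parser reports
```

Does that mean I should be calling register_input with (3, 299, 299) instead, or is something else going on?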
Any help would be appreciated.