UFF Parsing Failed

Trying to build a TensorRT engine for a custom network. I am able to create a UFF file from the frozen graph, but the uff_to_trt_engine call fails with an ambiguous error:

AssertionError: UFF parsing failed on line 186 in statement assert(parser_result)

Considering that I was able to build the UFF file, what might be the reason it then fails to build the engine? The relevant code:

import tensorrt as trt
import uff

tf_path = '/home/mmibm/networks/predict_frozen_graph.pb'
# Convert the frozen TensorFlow graph to UFF, naming the output node
uff_model = uff.from_tensorflow_frozen_model(tf_path, ["energy_estimator/e_upscore_1/conv2d_transpose"], text=True)

G_LOGGER = trt.infer.ConsoleLogger(trt.infer.LogSeverity.ERROR)
parser = trt.parsers.uffparser.create_uff_parser()
parser.register_input("image_input", (3, 384, 512), 0)

engine = trt.utils.uff_to_trt_engine(G_LOGGER, uff_model, parser, 1, 1 << 30, trt.infer.DataType.FLOAT)
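One thing worth double-checking: register_input expects dimensions in CHW order, while TensorFlow frozen graphs are normally NHWC, and a mismatch here is a common cause of UFF parser failures. A minimal sketch of a shape-conversion helper (the function name and the example shape are illustrative, not from the original post):

```python
def nhwc_to_chw(shape):
    """Convert a TensorFlow NHWC shape (batch, height, width, channels)
    to the CHW tuple that register_input expects."""
    _, height, width, channels = shape
    return (channels, height, width)

# A 384x512 RGB input, matching the register_input call above:
print(nhwc_to_chw((1, 384, 512, 3)))  # (3, 384, 512)
```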

You can find the uff file here:


Update: Solved here: https://devtalk.nvidia.com/default/topic/1027424/jetson-tx2/incorrect-results-during-inference-using-tensorrt3-0-c-uff-parser/post/5230680/#5230680

Were you able to resolve this?

Hi @ian.bell87, have you solved this error? If so, please share the solution.

File "/home/wang/PycharmProjects/Test_VGG_Tensorrt/create_engine.py", line 44, in
File "/home/wang/PycharmProjects/Test_VGG_Tensorrt/create_engine.py", line 29, in create_and_save_inference_engine
File "/usr/local/lib/python3.5/dist-packages/tensorrt/utils/_utils.py", line 263, in uff_to_trt_engine
raise AssertionError('UFF parsing failed on line {} in statement {}'.format(line, text))
AssertionError: UFF parsing failed on line 255 in statement assert(parser.parse(stream, network, model_datatype))

Hey @p146103, did you manage to figure it out? I am stuck on the exact same problem and can't find any resources to go by.