Write and Read a UFF model

Hello,
I am trying to convert my protobuf file to a TensorRT engine. First I convert it to a UFF model, then I build the engine from that model with the script below:

import uff
import tensorrt as trt
from tensorrt.parsers import uffparser


frozen_graph_filename = './model.pb'
output_filename = './test.engine'
output_names = ['Softmax']
input_name = 'imgs'
input_height = 224
input_width = 448
input_channels = 3

# from_tensorflow() expects an in-memory GraphDef; for a frozen .pb file
# on disk, use from_tensorflow_frozen_model() instead.
uff_model = uff.from_tensorflow_frozen_model(frozen_graph_filename, output_names)

G_LOGGER = trt.infer.ConsoleLogger(trt.infer.LogSeverity.ERROR)

parser = uffparser.create_uff_parser()
# The trailing 0 selects the channels-first (CHW) input order.
parser.register_input(input_name, (input_channels, input_height, input_width), 0)
parser.register_output(output_names[0])

# Build with max batch size 1 and a 64 MB (1 << 26 byte) workspace.
engine = trt.utils.uff_to_trt_engine(G_LOGGER, uff_model, parser, 1, 1 << 26)
trt.utils.write_engine_to_file(output_filename, engine.serialize())
engine.destroy()

Now my question: I want to generate only the UFF model, and create the engine later or on another machine.
How can I save my UFF model directly to disk? And how can I then read it back and pass it to

trt.utils.uff_to_trt_engine

?

Hello,

You can save a UFF model to disk by passing output_filename to uff.from_tensorflow():

uff.from_tensorflow(graphdef=frozen_graph,
                    output_filename=UFF_OUTPUT_FILENAME,
                    output_nodes=OUTPUT_NAMES,
                    text=True)  # text=True also writes a human-readable copy

You can then load the UFF file with the UFF parser API:
https://docs.nvidia.com/deeplearning/sdk/tensorrt-api/python_api/parsers/Uff/pyUff.html
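For the second half of the question: the file written via output_filename is just the serialized UFF stream, so reading it back is plain file I/O, and the bytes you read are what trt.utils.uff_to_trt_engine() accepts in place of the in-memory return value of uff.from_tensorflow(). A minimal sketch of the round trip (the path and placeholder bytes are illustrative, and the TensorRT call is left commented out since it needs the legacy trt runtime installed):

```python
import os
import tempfile

# Hypothetical location for the serialized model (illustration only).
uff_path = os.path.join(tempfile.mkdtemp(), "model.uff")

# In the real workflow, uff.from_tensorflow(..., output_filename=uff_path)
# writes the serialized graph here; placeholder bytes stand in for it.
serialized = b"placeholder-uff-bytes"
with open(uff_path, "wb") as f:
    f.write(serialized)

# Later, or on another machine: read the stream back verbatim.
with open(uff_path, "rb") as f:
    uff_model = f.read()

assert uff_model == serialized  # the round trip is byte-exact

# The bytes can then go straight into the engine builder, e.g.:
# engine = trt.utils.uff_to_trt_engine(G_LOGGER, uff_model, parser,
#                                      1, 1 << 26)
```

This keeps the conversion and engine-building steps fully decoupled: the machine that runs TensorFlow only needs the uff converter, and the machine that builds the engine only needs the .uff file plus TensorRT.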