Saving a serialized model to disk

Hi there. Following the dev guide, I managed to parse a network from Caffe and serialize it.

Now I want to save that serialized network to disk for later use in inference. According to the guide:

IHostMemory *serializedModel = engine->serialize();
// store model to disk
// <...>
serializedModel->destroy();

But I’m not sure how to perform that ‘store model to disk’ step and haven’t found anything useful in the examples. Does anyone know how to save the serialized network?

You can use a plain std::ofstream like this:

std::ofstream ofs("serialized_engine.trt", std::ios::out | std::ios::binary);
ofs.write(static_cast<const char*>(serializedModel->data()), serializedModel->size());
ofs.close();

Thank you, this is exactly what I was looking for.

Hello! Using the code below, I have serialized the network to disk successfully.

std::ofstream ofs("serialized_engine.txt", std::ios::out | std::ios::binary);
ofs.write(static_cast<const char*>(serializedModel->data()), serializedModel->size());
ofs.close();

Now I have a new problem: how do I use "serialized_engine.txt" for inference? How do I load it and turn it back into the serialized model?

So how do we read the stored model back?