Issue storing/reading a serialized model to/from a file

I am trying to store a serialized TensorRT-generated model using C++:

IHostMemory* serialized = engine->serialize();

When I print the size of the serialized buffer, I get a value in the MBs, which is fine: it means my optimized network is getting serialized. But now I am confused about how to store this serialized model in a file, and how to read it back from the file for inference.
I also looked at the file tensorNet.cpp through a link; in that file

std::ostream& gieModelstream

was used.
Can anybody please explain how to properly serialize the model to a file and then deserialize it from the file for inference?