Issue storing/reading a serialized TensorRT model to/from a file

I am trying to store a serialized TensorRT-generated model using the C++ API:

IHostMemory* serialized = engine->serialize();

When I print

serialized->size()

I get a value of a few MB, which is fine: it means my optimized network is being serialized. Now I am confused about how to store this serialized model to a file, and how to read it back from the file for inference.
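
For the writing part, here is my guess, a minimal sketch based on the IHostMemory accessors (the file name engine.trt is just a placeholder I chose):

#include <fstream>

// Serialize the optimized engine and write the raw bytes to disk
nvinfer1::IHostMemory* serialized = engine->serialize();
std::ofstream outFile("engine.trt", std::ios::binary);
outFile.write(static_cast<const char*>(serialized->data()), serialized->size());
outFile.close();
serialized->destroy(); // free the host-side buffer once the bytes are on disk

Is this the right approach, or is there a recommended way?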
I also looked at the file tensorNet.cpp in
https://github.com/dusty-nv/jetson-inference
In that file,

std::ostream& gieModelstream

was used.
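
Adapting from that, my guess for reading the file back and deserializing it for inference is something like the following sketch (I am assuming the deserializeCudaEngine overload that takes a plugin factory, which I pass as nullptr since I have no custom layers, and gLogger is whatever ILogger instance was already used for building):

#include <fstream>
#include <vector>

// Read the serialized engine bytes back from disk
std::ifstream inFile("engine.trt", std::ios::binary | std::ios::ate);
std::streamsize size = inFile.tellg();
inFile.seekg(0, std::ios::beg);
std::vector<char> buffer(size);
inFile.read(buffer.data(), size);

// Recreate the engine for inference
nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(gLogger);
nvinfer1::ICudaEngine* engine = runtime->deserializeCudaEngine(buffer.data(), buffer.size(), nullptr);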
Can anybody please explain how to properly serialize the model to a file and then deserialize it from the file for inference?
Thanks