I managed to parse a network from Caffe and serialize it. With the following code, I saved the serialized engine to a file named "serialized_engine.txt":
nvinfer1::IHostMemory* serializedModel = engine->serialize();
// store model to disk
std::ofstream ofs("serialized_engine.txt", std::ios::out | std::ios::binary);
ofs.write(static_cast<const char*>(serializedModel->data()), serializedModel->size());
But I don't know how to use "serialized_engine.txt" afterwards, and I haven't found anything relevant in the samples.
Does anyone know how to load "serialized_engine.txt" back into an engine for inference?
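From reading the API headers, I would guess the loading side looks roughly like the sketch below, using `nvinfer1::createInferRuntime` and `IRuntime::deserializeCudaEngine`. I haven't verified this: the exact `deserializeCudaEngine` signature varies across TensorRT versions (older releases take a third `IPluginFactory*` argument), and the `Logger` class here is just a minimal stand-in. Is this the right approach?

```cpp
#include <fstream>
#include <iostream>
#include <iterator>
#include <vector>
#include "NvInfer.h"

// Minimal logger implementation required by the TensorRT runtime.
class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity <= Severity::kWARNING)
            std::cout << msg << std::endl;
    }
};

int main()
{
    // Read the serialized engine bytes back from disk.
    std::ifstream ifs("serialized_engine.txt", std::ios::in | std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(ifs)),
                           std::istreambuf_iterator<char>());

    // Recreate the engine from the raw bytes.
    Logger logger;
    nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(logger);
    nvinfer1::ICudaEngine* engine =
        runtime->deserializeCudaEngine(blob.data(), blob.size(), nullptr);

    // The engine should then be usable for inference via an execution context:
    // nvinfer1::IExecutionContext* context = engine->createExecutionContext();

    return 0;
}
```

This compiles only against the TensorRT SDK and needs a CUDA-capable GPU at runtime, so treat it as a sketch rather than a tested answer.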