Problem about saving the serialized TensorRT model.

I tried the sample named sampleONNXMNIST on Windows 10 with VS2015, CUDA 10, TensorRT-5.1.5.0, and cuDNN v7.5.0.56, and it works well.
I want to save the model as follows:
IHostMemory* trtModelStream{ nullptr };
onnxToTRTModel("mnist.onnx", 1, trtModelStream);
system("PAUSE");
assert(trtModelStream != nullptr);
assert(runtime != nullptr);
if (gUseDLACore >= 0)
{ runtime->setDLACore(gUseDLACore); }
std::ofstream output("mnist.engine", std::ios::binary | std::ios::out | std::ios::app);
output.write(reinterpret_cast<const char*>(trtModelStream->data()), trtModelStream->size());
output.close();
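As an aside on the write side: opening with std::ios::app appends to any existing "mnist.engine", so running the program twice produces a file with two concatenated plans. A minimal sketch of a truncating binary write (the writeEngineFile name is my own, not from the sample):

```cpp
#include <cassert>
#include <cstdint>
#include <fstream>
#include <string>
#include <vector>

// Hypothetical helper: write a serialized engine buffer to disk.
// std::ios::binary matters on Windows (no CR/LF translation), and
// std::ios::trunc replaces any stale engine instead of appending to it.
bool writeEngineFile(const std::string& path, const void* data, size_t size)
{
    std::ofstream output(path, std::ios::binary | std::ios::trunc);
    if (!output)
        return false;
    output.write(static_cast<const char*>(data),
                 static_cast<std::streamsize>(size));
    return output.good();
}
```

In the sample this would be called as writeEngineFile("mnist.engine", trtModelStream->data(), trtModelStream->size()).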
I load the model as follows:
std::ifstream cache("mnist.engine");
std::stringstream gieModelStream;
gieModelStream.seekg(0, gieModelStream.beg);
gieModelStream << cache.rdbuf();
cache.close();
gieModelStream.seekg(0, std::ios::end);
const int modelSize = gieModelStream.tellg();
gieModelStream.seekg(0, std::ios::beg);
void* modelMem = malloc(modelSize);
gieModelStream.read((char*)modelMem, modelSize);
ICudaEngine* engine = runtime->deserializeCudaEngine(modelMem, modelSize, nullptr);
I got the wrong result, and it says:

[E] [TRT] The engine plan file is not compatible with this version of TensorRT, expecting library version 5.1.5 got 0.0.0, please rebuild.

What is the reason…?
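One thing worth noting: the loader above opens "mnist.engine" in text mode, which on Windows translates bytes in the stream. A sketch of a binary-mode loader, assuming the file was written in binary mode (the readEngineFile name is my own, not from the sample):

```cpp
#include <cassert>
#include <fstream>
#include <string>
#include <vector>

// Hypothetical helper: read a serialized engine back into memory.
// Opening with std::ios::binary avoids the text-mode translation that
// can corrupt engine bytes on Windows; std::ios::ate positions the
// stream at the end, so tellg() reports the file size directly.
std::vector<char> readEngineFile(const std::string& path)
{
    std::ifstream cache(path, std::ios::binary | std::ios::ate);
    if (!cache)
        return {};
    const std::streamsize modelSize = cache.tellg();
    cache.seekg(0, std::ios::beg);
    std::vector<char> modelMem(static_cast<size_t>(modelSize));
    cache.read(modelMem.data(), modelSize);
    return modelMem;
}
```

The buffer would then be handed to runtime->deserializeCudaEngine(modelMem.data(), modelMem.size(), nullptr), and the std::vector frees the memory automatically instead of requiring malloc/free.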

Hi,

I have a similar problem when deserialising models. Almost the same code works on Linux, but I'm getting the same error on Windows:

"The engine plan file is not compatible with this version of TensorRT, expecting library version 5.1.5 got 0.0.0, please rebuild"

Did you eventually solve the issue, or find out why this wasn't working?

Regards