Problem exporting TensorRT engine to file and reimporting it.

Hi,
I am trying to save the serialized engine to a file, then load the file and deserialize it to recreate the engine, but my code doesn't work properly. Here is my code:

/* SAVE */

// Serialize the built engine into a host-memory blob.
IHostMemory *dataStream = pEngine->serialize();
int dataLen = dataStream->size();
std::string binaryData;
binaryData.resize(dataLen);
// Copy the serialized bytes into the string buffer.
memcpy(&binaryData[0], dataStream->data(), dataLen);

/* Then I save the string binaryData to the disk */

/* Load and Deserialize */
std::string binaryData;

/* Read the string back from disk; dataLen holds the size of the serialized engine */
CHECK_RTN_LOGE_OF_FUNC(pSequentialFile->Read(dataLen, binaryData));

// Deserialize the blob back into an engine.
TensorRTLogger logger;
IRuntime* infer = createInferRuntime(logger);

pEngine = infer->deserializeCudaEngine((void*)binaryData.data(), dataLen, nullptr);

When I run the code, a segmentation fault occurs. Here is the backtrace:
SIGSEGV caught

Backtrace:
0: /lib/x86_64-linux-gnu/libc.so.6 (killpg+0x40) [0x7f42d1716cef]
1: /usr/local/TensorRT-2.1.2/lib/libnvinfer.so.3 (nvinfer1::cudnn::Engine::deserialize(void const*, unsigned long, nvinfer1::IPluginFactory*)+0xce) [0x7f42d3b42b5e]
2: /usr/local/TensorRT-2.1.2/lib/libnvinfer.so.3 (nvinfer1::Runtime::deserializeCudaEngine(void const*, unsigned long, nvinfer1::IPluginFactory*)+0x43) [0x7f42d3b65163]


Please advise me on what is wrong with my code.
Also, the TensorRT 3.0 developer documentation says that the SampleMNIST sample shows how to export an engine to a file and reimport it, but I cannot find such code in the sample's source.
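
For context, the end-to-end flow I am aiming for looks roughly like the sketch below. Plain std::ofstream/std::ifstream stand in for our own file I/O wrappers (pSequentialFile and CHECK_RTN_LOGE_OF_FUNC), the SimpleLogger class and the "model.engine" path are just placeholders, and this assumes the TensorRT 2.x/3.x nvinfer1 API shown above:

#include <fstream>
#include <iterator>
#include <string>
#include <iostream>
#include "NvInfer.h"

// Minimal logger, standing in for our TensorRTLogger.
class SimpleLogger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity <= Severity::kWARNING)
            std::cerr << msg << std::endl;
    }
};

// Serialize an already-built engine and write the raw bytes to disk in binary mode.
void saveEngine(nvinfer1::ICudaEngine* engine, const std::string& path)
{
    nvinfer1::IHostMemory* blob = engine->serialize();
    std::ofstream out(path, std::ios::binary);
    out.write(static_cast<const char*>(blob->data()), blob->size());
    blob->destroy();
}

// Read the whole file back and deserialize it into an engine.
// The runtime should stay alive while the engine is in use.
nvinfer1::ICudaEngine* loadEngine(const std::string& path, nvinfer1::IRuntime& runtime)
{
    std::ifstream in(path, std::ios::binary);
    std::string blob((std::istreambuf_iterator<char>(in)),
                      std::istreambuf_iterator<char>());
    // The second argument must be the exact size of the serialized engine.
    return runtime.deserializeCudaEngine(blob.data(), blob.size(), nullptr);
}

// Usage (pEngine built elsewhere):
//   saveEngine(pEngine, "model.engine");
//   SimpleLogger logger;
//   nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(logger);
//   nvinfer1::ICudaEngine* engine = loadEngine("model.engine", *runtime);
//   ...
//   engine->destroy();
//   runtime->destroy();

The main things this sketch relies on are writing and reading the file in binary mode and passing the exact serialized byte count to deserializeCudaEngine.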

Problem solved.
I found a bug in my code.

Please, I am facing the same issue. Could you share how you serialize the engine to disk?

+1

@twlightloki,

Can you please share how you fixed the bug in your code? I have the same bug.

Thanks,
Achyut