Hi all,
I was trying to create an engine from the sampleUffMNIST sample code.
TensorRT-5.0.4.3
Visual Studio 2017
Windows 10
I used the code snippets from GitHub - dusty-nv/jetson-inference: Hello AI World guide to deploying deep-learning inference networks and deep vision primitives with TensorRT and NVIDIA Jetson, as suggested in Loading of the tensorRT Engine in C++ API - Jetson TX1 - NVIDIA Developer Forums.
The engine was created successfully.
// This is the code used to serialize and write the engine
IHostMemory* trtModelStreamtry = engine->serialize();
std::stringstream gieModelStream;
gieModelStream.seekg(0, gieModelStream.beg);
gieModelStream.write((const char*)trtModelStreamtry->data(), trtModelStreamtry->size());
std::ofstream outFile;
outFile.open("outtest.engine");
outFile << gieModelStream.rdbuf();
outFile.close();
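In case it matters, here is a minimal sketch of the same write path with the file opened in binary mode. I'm not sure whether the missing std::ios::binary flag (and the intermediate stringstream) is related to the crash; this is just how I understand a direct binary write would look:

// Sketch: write the serialized engine straight to a binary ofstream.
// The std::ios::binary flag is my guess at what might matter on Windows,
// since a text-mode stream translates newline bytes.
nvinfer1::IHostMemory* serialized = engine->serialize();
std::ofstream engineFile("outtest.engine", std::ios::binary);
engineFile.write(static_cast<const char*>(serialized->data()), serialized->size());
engineFile.close();
serialized->destroy();  // release the host buffer once it has been written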
I then tried reading the engine back and deserializing it with the following code:
std::vector<char> trtModelStream_;
size_t size{ 0 };
std::ifstream file("outtest.engine", std::ios::binary);
if (file.good())
{
    file.seekg(0, file.end);
    size = file.tellg();
    file.seekg(0, file.beg);
    trtModelStream_.resize(size);
    std::cout << "size" << trtModelStream_.size() << std::endl;
    file.read(trtModelStream_.data(), size);
    file.close();
}
std::cout << "size" << size;
IRuntime* runtime = createInferRuntime(gLogger);
assert(runtime != nullptr);
ICudaEngine* engine = runtime->deserializeCudaEngine(trtModelStream_.data(), size, nullptr);
During runtime->deserializeCudaEngine it throws:
Exception thrown at 0x00007FFD2336E8B0 (nvinfer.dll) in sample_uff_mnist.exe: 0xC0000005: Access violation reading location 0x0000019563E06FA0.
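I think a sanity check like the following could help narrow it down (just a sketch; comparing the on-disk size against trtModelStreamtry->size() is my assumption about what should match):

// Sketch: compare the on-disk file size with the serialized buffer size,
// and guard the deserialization result before using the engine.
std::ifstream check("outtest.engine", std::ios::binary | std::ios::ate);
std::cout << "bytes on disk: " << check.tellg() << std::endl;  // should equal trtModelStreamtry->size()
check.close();

nvinfer1::ICudaEngine* engineCheck = runtime->deserializeCudaEngine(trtModelStream_.data(), size, nullptr);
if (!engineCheck)
{
    std::cerr << "deserializeCudaEngine returned nullptr" << std::endl;
}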
Is engine creation and usage supported on Windows?