Cannot load TensorRT model

Loading the TensorRT model with the nvinfer plugin works well, as below.

[config.file]

model-engine-file=MODEL_PATH

But when I load the model from C++ code, it doesn't work.

A segmentation fault occurs in "deserializeCudaEngine".

What should I do?

[C++ code]

#include <fstream>
#include <iostream>
#include <sstream>
#include "NvInfer.h"

// Open the serialized engine in binary mode; text mode can corrupt the data.
std::ifstream cache(MODEL_PATH, std::ios::binary);
if (!cache)
    std::cout << "file not exist error" << std::endl;

// Read the whole file into a stringstream.
std::stringstream gieModelStream;
gieModelStream << cache.rdbuf();
cache.close();

infer = nvinfer1::createInferRuntime(marvinpose->gLogger);

// Determine the size of the serialized engine.
gieModelStream.seekg(0, std::ios::end);
const size_t modelSize = gieModelStream.tellg();
gieModelStream.seekg(0, std::ios::beg);

// Copy the stream contents into a contiguous buffer.
void *modelMem = malloc(modelSize);
gieModelStream.read((char*)modelMem, modelSize);

engine = infer->deserializeCudaEngine(modelMem, modelSize, NULL);
free(modelMem);

Hi,

Could you share the error message with us?
Thanks.

Hi,

There’s no error message.

It only shows a segmentation fault.

Is there anything else I should do?

I think the nvinfer plugin also loads the model this way.

Thanks.

Hi,

First, could you check whether modelMem is valid?
Just print out the value and make sure it is not empty.
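
For example, a quick sanity check like this (a minimal sketch reusing the modelMem and modelSize names from your snippet; the hex dump is just for inspection) can confirm the buffer was filled before calling deserializeCudaEngine:

// Verify the buffer was allocated and filled before deserializing.
if (modelMem == NULL || modelSize == 0)
    std::cout << "model buffer is empty" << std::endl;

// Print the size and the first few bytes as a sanity check.
std::cout << "modelSize = " << modelSize << std::endl;
const unsigned char* p = (const unsigned char*)modelMem;
for (size_t i = 0; i < 8 && i < modelSize; i++)
    std::cout << std::hex << (int)p[i] << " ";
std::cout << std::dec << std::endl;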

Then, could you enable more verbose logging in gLogger:

// Forward every message except INFO-level ones to stdout.
class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity != Severity::kINFO)
            std::cout << msg << std::endl;
    }
} gLogger;

...
infer = nvinfer1::createInferRuntime(marvinpose->gLogger);

This should produce much more detailed nvinfer logging; please share the messages with us.
Thanks.