Loading a TensorRT engine with the C++ API

Is it possible to load a serialized engine built with TensorRT 4.0 into TensorRT 5.0?

I am getting the following error:

deserializeCudaEngine::51, condition: (blob) != nullptr


It won’t work.

A TensorRT PLAN depends on both the package version and the GPU architecture.
It cannot be used across platforms or across versions.


I have the same error: just a segmentation fault, without any other information. Any suggestions? I don’t understand the “typecasting” either.

Thanks in advance.

Hi @AastaLLL,

Is there any method that lets me check whether an engine plan file was generated on an incompatible device?

I want to catch the problem before deserializeCudaEngine returns an error.