TensorRT 4 int8 deserialize issue

I successfully built the engine in INT8 mode, but when the program deserializes the gieModelStream with

ICudaEngine* engine = runtime->deserializeCudaEngine(gieModelStream->data(), gieModelStream->size(), nullptr);

it fails with the following assertion:

cudnnSerializationUtils.h:67: const T& nvinfer1::cudnn::extractBlobs(const ifb::LayerParams&) [with T = ifb::CaskDeconvolutionBlobs; ifb::BlobsUnion tag = (ifb::BlobsUnion)5u]: Assertion `x == tag’ failed.

Does this mean something went wrong during calibration?
What is the cause of the failure? An unsupported layer?
If a layer were the problem, why did calibration finish and why didn't the parser report any errors?

Most importantly, sometimes the program deserializes the stream SUCCESSFULLY and runs inference properly, and sometimes it fails with the assertion above.