Error when deserializing a deconvolution layer with INT8

I got this error when deserializing my INT8 plan file.

I tried to find cudnnSerializationUtils.h, but it does not exist in my installation.

cudnnSerializationUtils.h:67: const T& nvinfer1::cudnn::extractBlobs(const ifb::LayerParams&) [with T = ifb::CaskDeconvolutionBlobs; ifb::BlobsUnion tag = (ifb::BlobsUnion)5u]: Assertion `x == tag' failed.

My Deconvolution parameters are (Caffe):
kernel_size: 2
stride: 2
pad: 0
bias: true

Environment:
NVIDIA Drive PX2 (AutoChauffeur)
dGPU
CUDA 9.0
cuDNN 7.1.2
TensorRT 4.1.1 (?)

Thanks.

Hello,

Can you try TensorRT 5 RC and see if this problem still persists?

@NVES

Thanks for the reply.

I implemented a custom deconvolution plugin layer and the error disappeared.
(It is cuDNN based, so I think the logic is the same as the built-in layer.)
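
Roughly, the plugin uses the standard cuDNN way of expressing deconvolution (transposed convolution) as the data gradient of a convolution, i.e. cudnnConvolutionBackwardData. Below is only a simplified sketch for the kernel_size=2, stride=2, pad=0, FP32 NCHW case above; the names, shapes and error handling are illustrative, not my exact code, and in a real plugin the descriptors would be created once in initialize() rather than per call.

#include <cudnn.h>

// Sketch: deconvolution with kernel 2, stride 2, pad 0 via cuDNN.
// Caffe deconvolution weights are laid out as (cIn, cOut, kH, kW).
void deconvForward(cudnnHandle_t handle,
                   const float* weights, const float* bias,
                   const float* input, float* output,
                   int n, int cIn, int hIn, int wIn, int cOut,
                   void* workspace, size_t workspaceBytes,
                   cudaStream_t stream)
{
    cudnnSetStream(handle, stream);

    cudnnTensorDescriptor_t inDesc, outDesc, biasDesc;
    cudnnFilterDescriptor_t wDesc;
    cudnnConvolutionDescriptor_t convDesc;
    cudnnCreateTensorDescriptor(&inDesc);
    cudnnCreateTensorDescriptor(&outDesc);
    cudnnCreateTensorDescriptor(&biasDesc);
    cudnnCreateFilterDescriptor(&wDesc);
    cudnnCreateConvolutionDescriptor(&convDesc);

    // Deconvolution = data gradient of a convolution: the layer input plays
    // the role of dy and the layer output the role of dx.
    cudnnSetTensor4dDescriptor(inDesc, CUDNN_TENSOR_NCHW, CUDNN_DATA_FLOAT,
                               n, cIn, hIn, wIn);
    cudnnSetTensor4dDescriptor(outDesc, CUDNN_TENSOR_NCHW, CUDNN_DATA_FLOAT,
                               n, cOut, hIn * 2, wIn * 2);  // k=2, stride=2, pad=0
    cudnnSetTensor4dDescriptor(biasDesc, CUDNN_TENSOR_NCHW, CUDNN_DATA_FLOAT,
                               1, cOut, 1, 1);
    cudnnSetFilter4dDescriptor(wDesc, CUDNN_DATA_FLOAT, CUDNN_TENSOR_NCHW,
                               cIn, cOut, 2, 2);
    cudnnSetConvolution2dDescriptor(convDesc, 0, 0, 2, 2, 1, 1,
                                    CUDNN_CROSS_CORRELATION, CUDNN_DATA_FLOAT);

    const float one = 1.0f, zero = 0.0f;
    cudnnConvolutionBackwardData(handle, &one, wDesc, weights,
                                 inDesc, input, convDesc,
                                 CUDNN_CONVOLUTION_BWD_DATA_ALGO_1,
                                 workspace, workspaceBytes,
                                 &zero, outDesc, output);
    if (bias)
        cudnnAddTensor(handle, &one, biasDesc, bias, &one, outDesc, output);

    cudnnDestroyConvolutionDescriptor(convDesc);
    cudnnDestroyFilterDescriptor(wDesc);
    cudnnDestroyTensorDescriptor(biasDesc);
    cudnnDestroyTensorDescriptor(outDesc);
    cudnnDestroyTensorDescriptor(inDesc);
}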

But I wonder why the memory access is different between inference and INT8 conversion (calibration).

Originally, I implemented my last layer as a custom plugin layer that used host memory for some of its outputs.

But when I convert the model to INT8, the plugin must use device memory, so I now keep two versions of the code: a device-memory version for the INT8 conversion and a host-memory version for inference.
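
The workaround looks roughly like this (simplified sketch, not my exact code; the member names such as mCopyToHost and mHostOutput are just illustrative): enqueue() always writes the result to the device output binding, which is what the INT8 calibrator reads, and the copy to host is only an extra step for the normal inference application.

#include <cuda_runtime.h>
#include <cstddef>

struct DeconvPluginSketch
{
    bool   mCopyToHost   = false;    // true only in the inference application
    float* mHostOutput   = nullptr;  // pinned host buffer (cudaMallocHost)
    size_t mOutputVolume = 0;        // C*H*W of one output sample

    int enqueue(int batchSize, const void* const* inputs, void** outputs,
                void* workspace, cudaStream_t stream)
    {
        float* devOut = static_cast<float*>(outputs[0]);

        // ... run the cuDNN deconvolution into devOut (see the sketch above) ...

        if (mCopyToHost && mHostOutput != nullptr)
        {
            // The calibrator never sees this copy; it only reads devOut.
            cudaMemcpyAsync(mHostOutput, devOut,
                            batchSize * mOutputVolume * sizeof(float),
                            cudaMemcpyDeviceToHost, stream);
        }
        return 0;
    }
};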

Thanks.

How did you deal with this problem?
Could you share your code?
Thanks.