TensorRT Int8 Calibration Failed

I am trying to quantize a ResNet-based pose estimation model. The input to the model is warped images with zero padding. The model quantized with IInt8EntropyCalibrator produces completely wrong outputs, even worse than the result from setAllTensorScales, and I cannot find any information on how to debug this. One of the layer activations in the model is almost all zeros (about 99%), so my guess is that IInt8EntropyCalibrator estimates a wrong distribution from it. Can you help me?
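For reference, this is roughly how the calibrator is wired into the builder on my side. It is a simplified sketch, not my actual code: the PoseCalibrator class, the empty getBatch, and the TRT 5 style builder calls are placeholders, and the data loading is left out.

```cpp
#include "NvInfer.h"
#include <cstddef>

// Minimal IInt8EntropyCalibrator sketch (TRT 5 era API, no noexcept qualifiers).
// getBatch() should upload the warped, zero-padded images to GPU memory,
// preprocessed exactly as they are at inference time.
class PoseCalibrator : public nvinfer1::IInt8EntropyCalibrator
{
public:
    PoseCalibrator(int batchSize, void* deviceInput)
        : mBatchSize(batchSize), mDeviceInput(deviceInput) {}

    int getBatchSize() const override { return mBatchSize; }

    bool getBatch(void* bindings[], const char* names[], int nbBindings) override
    {
        // Copy the next preprocessed batch into mDeviceInput here and return true.
        // Return false once the calibration set is exhausted.
        bindings[0] = mDeviceInput;
        return false; // placeholder: no real data source in this sketch
    }

    const void* readCalibrationCache(std::size_t& length) override { length = 0; return nullptr; }
    void writeCalibrationCache(const void* cache, std::size_t length) override {}

private:
    int mBatchSize;
    void* mDeviceInput;
};

// TRT 5 style builder wiring; newer versions move these settings onto IBuilderConfig.
void enableInt8(nvinfer1::IBuilder* builder, PoseCalibrator* calibrator)
{
    builder->setInt8Mode(true);
    builder->setInt8Calibrator(calibrator);
}
```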

Hi weijunsheng90:

How did you find out that the reshape operator caused the quantization failure? I am struggling with TensorRT int8 calibration myself: I get completely wrong inference results after calibrating my own network with TRT 5.

Sorry, I made a mistake. The quantization failure has nothing to do with the reshape operator. I still cannot quantize a pose estimation model that has been successfully quantized by the NCNN framework.
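In case it helps with debugging: if setAllTensorScales refers to the helper in NVIDIA's sampleINT8API sample, it just loops over the network tensors and calls ITensor::setDynamicRange. One experiment is to set the dynamic range of the suspicious, mostly-zero activation tensor by hand and see whether the outputs improve. A rough sketch of what I mean, assuming a TensorRT version that supports setDynamicRange; the tensor name lookup and the range value are placeholders, and the real range would have to come from profiling the fp32 activations of that layer:

```cpp
#include "NvInfer.h"
#include <string>

// Override the dynamic range of a single tensor, found by name in the network.
bool overrideTensorRange(nvinfer1::INetworkDefinition* network,
                         const std::string& tensorName, float maxAbs)
{
    for (int i = 0; i < network->getNbLayers(); ++i)
    {
        nvinfer1::ILayer* layer = network->getLayer(i);
        for (int j = 0; j < layer->getNbOutputs(); ++j)
        {
            nvinfer1::ITensor* t = layer->getOutput(j);
            if (tensorName == t->getName())
            {
                // Symmetric range [-maxAbs, maxAbs], which is what the int8 scale assumes.
                return t->setDynamicRange(-maxAbs, maxAbs);
            }
        }
    }
    return false; // tensor not found
}
```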