Int8 Calibration failing when one layer's output is uniformly zero

Hi,

I am training a custom network composed of residual connections (like a ResNet block). The problem is that in some cases the weights of one path eventually become all zeros during training, so the residual block degenerates into an identity mapping. The trained network still works well with these zero paths in float32, but it fails during int8 calibration. I have learned that TensorRT cannot calibrate a layer whose output is uniformly zero, and it raises this error:

[TensorRT] ERROR: Tensor ... is uniformly zero; network calibration failed.

In this case there is nothing I can do except retrain another model and hope that no identity mapping appears in the network. Is there any way I can work around this problem?

Hello,

Per engineering: is it sufficient to manually specify the dynamic range for the tensor?
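In the Python API, assigning a dynamic range is done through the tensor's `dynamic_range` attribute. Here is a rough sketch of walking the network to find a tensor by name; the stand-in classes at the bottom are only there so the snippet runs without TensorRT installed, and with a real `trt.INetworkDefinition` the loop is the same:

```python
def set_dynamic_range_by_name(network, tensor_name, max_abs):
    """Walk every layer output in the network; when the named tensor is
    found, give it a symmetric dynamic range [-max_abs, max_abs] so the
    int8 calibrator does not need to derive a scale from its (all-zero)
    activation data."""
    for i in range(network.num_layers):
        layer = network.get_layer(i)
        for j in range(layer.num_outputs):
            t = layer.get_output(j)
            if t.name == tensor_name:
                t.dynamic_range = (-max_abs, max_abs)  # (min, max) pair
                return True
    return False

# --- duck-typed stand-ins for illustration only; in practice pass the
# --- real tensorrt.INetworkDefinition you are building the engine from.
class _Tensor:
    def __init__(self, name):
        self.name = name
        self.dynamic_range = None

class _Layer:
    def __init__(self, outputs):
        self._outputs = outputs
    @property
    def num_outputs(self):
        return len(self._outputs)
    def get_output(self, j):
        return self._outputs[j]

class _Network:
    def __init__(self, layers):
        self._layers = layers
    @property
    def num_layers(self):
        return len(self._layers)
    def get_layer(self, i):
        return self._layers[i]

net = _Network([_Layer([_Tensor("conv1"), _Tensor("residual_add")])])
set_dynamic_range_by_name(net, "residual_add", 1.0)
print(net.get_layer(0).get_output(1).dynamic_range)  # (-1.0, 1.0)
```

Note the API names (`num_layers`, `get_layer`, `get_output`, `dynamic_range`) follow the TensorRT 7-era Python bindings; check them against your installed version.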

It’s an unusual circumstance to push a zero-weight convolution through an inference network - unusual enough that we don’t check for this and prune it automatically. We recommend you prune it yourself.
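If you go the pruning route, a first step is simply scanning the exported weights for branches that trained to exactly zero, so you know which residual blocks collapse to identity before handing the network to TensorRT. A minimal sketch (the dict of NumPy arrays here stands in for however you export your checkpoint):

```python
import numpy as np

def find_zero_weight_layers(weights_by_name):
    """Return the names of weight arrays that are exactly all zero.
    These branches contribute nothing, so the enclosing residual block
    reduces to an identity mapping and the branch can be removed."""
    return [name for name, w in weights_by_name.items()
            if isinstance(w, np.ndarray) and w.size and not np.any(w)]

weights = {
    "block1.conv.weight": np.random.randn(8, 8, 3, 3),
    "block2.conv.weight": np.zeros((8, 8, 3, 3)),  # trained to zero
}
print(find_zero_weight_layers(weights))  # ['block2.conv.weight']
```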

Hi,

I have the same problem. In my case, the biases of a layer are trained to zero (which is not that unusual). I tried manually specifying the dynamic range of only that tensor, but I still get the error. Is it possible that the calibrator ignores the setting and tries to calibrate the tensor itself? Is there a way to exclude that tensor from the calibration process? Or am I not setting the dynamic range at the correct stage, so that the setting is being overwritten somehow?

Fan
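For context on why a uniformly zero tensor is fatal rather than merely awkward: symmetric int8 quantization derives its scale from the observed dynamic range (roughly amax / 127), and an all-zero tensor gives amax = 0, leaving no usable scale. A toy sketch of that arithmetic (my own illustration, not TensorRT's internal code):

```python
def int8_scale(amax):
    """Symmetric int8 scale from a tensor's dynamic-range bound.
    A uniformly zero tensor gives amax == 0, which is exactly the
    degenerate case the calibrator rejects."""
    if amax == 0.0:
        raise ValueError("uniformly zero tensor: no usable int8 scale")
    return amax / 127.0

def quantize(x, scale):
    """Quantize one float value to a clipped int8 code."""
    q = int(round(x / scale))
    return max(-127, min(127, q))

s = int8_scale(127.0)     # scale 1.0 for a [-127, 127] range
print(quantize(3.0, s))   # 3
print(quantize(500.0, s)) # 127 (clipped to the int8 limit)
```

This is also why manually supplying a nonzero dynamic range is the suggested fix: it substitutes a valid scale for the one calibration cannot compute.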

Hi,

I also tried setting the layer(s) that produce the zero tensors to fp16 precision. The calibrator still stubbornly tries to calibrate them and throws the “Tensor is uniformly zero” error.

Fan