Hello @AastaLLL ,
It works now. The solution was to install the latest version of PyTorch with CUDA support from NVIDIA and build TorchVision from source.
However, I get very poor results from a YOLOv5 engine that was INT8-quantized and calibrated with the EfficientDet scripts here.
Results
So I suspect the calibration is not done correctly for YOLOv5. What do you think?
Question
Could you please tell me how to do proper INT8 calibration for YOLOv5 using JPEG/JPG images from the COCO dataset, the way the EfficientDet scripts do?
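For context, here is the per-image preprocessing I believe the calibrator would need to apply so that the calibration inputs match YOLOv5's inference pipeline (letterbox to 640x640, RGB, scale to [0, 1], HWC to CHW). This is a NumPy-only sketch based on YOLOv5's standard preprocessing, with my own helper names; it is not the EfficientDet sample code, whose normalization differs.

```python
import numpy as np

def letterbox(img: np.ndarray, size: int = 640) -> np.ndarray:
    """Resize with preserved aspect ratio and pad with 114 (YOLOv5's default pad value)."""
    h, w = img.shape[:2]
    scale = size / max(h, w)
    nh, nw = round(h * scale), round(w * scale)
    # Nearest-neighbour resize via index maps (keeps this sketch free of a cv2 dependency).
    rows = (np.arange(nh) / scale).astype(int).clip(0, h - 1)
    cols = (np.arange(nw) / scale).astype(int).clip(0, w - 1)
    resized = img[rows][:, cols]
    canvas = np.full((size, size, 3), 114, dtype=img.dtype)
    top, left = (size - nh) // 2, (size - nw) // 2
    canvas[top:top + nh, left:left + nw] = resized
    return canvas

def preprocess(img: np.ndarray, size: int = 640) -> np.ndarray:
    """HWC uint8 RGB image -> 1x3xSxS float32 batch scaled to [0, 1]."""
    x = letterbox(img, size).astype(np.float32) / 255.0
    x = x.transpose(2, 0, 1)               # HWC -> CHW
    return np.ascontiguousarray(x[None])   # add batch dimension

# Example with a dummy 480x640 array standing in for a decoded COCO JPEG.
dummy = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
batch = preprocess(dummy)
print(batch.shape, batch.dtype)  # (1, 3, 640, 640) float32
```

Each batch produced this way would then be fed to the calibrator (e.g. an `IInt8EntropyCalibrator2` implementation) during engine building, instead of the EfficientDet-style normalized inputs.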
NOTE:
I get the warnings below when generating the engine (maybe this helps you).
I also found this thread here; maybe the problem is not the calibration but TensorRT itself!
[TRT] [W] - Subnormal FP16 values detected.
[TRT] [W] If this is not the desired behavior, please modify the weights or retrain with regularization to reduce the magnitude of the weights.
[TRT] [W] Weights [name=Conv_195 + PWN(PWN(Sigmoid_196), Mul_197).weight] had the following issues when converted to FP16:
[TRT] [W] - Subnormal FP16 values detected.
[TRT] [W] If this is not the desired behavior, please modify the weights or retrain with regularization to reduce the magnitude of the weights.
[TRT] [W] Weights [name=Conv_195 + PWN(PWN(Sigmoid_196), Mul_197).weight] had the following issues when converted to FP16:
[TRT] [W] - Subnormal FP16 values detected.
[TRT] [W] If this is not the desired behavior, please modify the weights or retrain with regularization to reduce the magnitude of the weights.
[TRT] [W] Weights [name=Conv_195 + PWN(PWN(Sigmoid_196), Mul_197).weight] had the following issues when converted to FP16:
[TRT] [W] - Subnormal FP16 values detected.
[TRT] [W] If this is not the desired behavior, please modify the weights or retrain with regularization to reduce the magnitude of the weights.
[TRT] [W] Weights [name=Conv_198.weight] had the following issues when converted to FP16:
[TRT] [W] - Subnormal FP16 values detected.
[TRT] [W] If this is not the desired behavior, please modify the weights or retrain with regularization to reduce the magnitude of the weights.
[TRT] [W] Weights [name=Conv_198.weight] had the following issues when converted to FP16:
Thank you very much for your help @AastaLLL :)
Harry