INT8 calibration table for Yolov2

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): Tesla T4
• DeepStream Version: 5.0
• JetPack Version (valid for Jetson only)
• TensorRT Version : 7.0
• NVIDIA GPU Driver Version (valid for GPU only): 440.64.00

DS 5.0 ships an INT8 calibration table for YoloV3 but not for YoloV2.

With DS 4, it was possible to generate a calibration table for YoloV2 and run it in INT8 – the link below was used.

But DS 5 uses TRT 7, which introduced some changes, and the calibration table generated as above no longer works.

Is there any INT8 calibration table for YoloV2 available for use?

Or how can I generate one with TensorRT 7?

Hi @andy.linluo
We don’t provide INT8 calibration table for YoloV2.
Will you use YoloV2 in your product?

For now, to generate an INT8 calibration table, you need to add TRT INT8 calibration in /opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_Yolo/nvdsinfer_custom_impl_Yolo/yolo.cpp, referring to the calibration code in the TRT INT8 sample: ${tensorrt_package}/samples/sampleINT8