Recently, I started using DeepStream to accelerate inference. I have looked through the whole SDK and the samples, but I could not find any document describing how to generate the INT8 calibration file (the int8-calib-file setting), such as yolov3-calibration.table.trt5.1 in the DeepStream Yolo example. Does anyone have experience generating the calibration file with Python for DeepStream? Thanks.
Since DeepStream wraps TensorRT, this is really a TensorRT question.
You can refer to the TensorRT Python sample samples/python/int8_caffe_mnist.
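To expand on that pointer: TensorRT's Python API lets you subclass `trt.IInt8EntropyCalibrator2`, feed it preprocessed batches, and have it write a calibration cache file, which is what DeepStream's `int8-calib-file` config key points to. Below is a hedged sketch modeled loosely on the int8_caffe_mnist sample; the class name `Int8Calibrator`, the helper `batch_stream`, the cache filename, and the single-input assumption are all mine, not from the SDK. The `try`/`except` around the TensorRT/PyCUDA imports is only there so the sketch can be read on a machine without a GPU.

```python
import os
import numpy as np

try:  # TensorRT and PyCUDA are only present on a machine with the SDK installed
    import tensorrt as trt
    import pycuda.driver as cuda
    import pycuda.autoinit  # noqa: F401  (creates the CUDA context)
    CalibratorBase = trt.IInt8EntropyCalibrator2
except ImportError:
    CalibratorBase = object  # lets the sketch be inspected without TensorRT

def batch_stream(images, batch_size):
    """Yield contiguous float32 batches from a preprocessed image array.

    `images` is assumed to already be resized/normalized exactly as the
    network expects at inference time (NCHW layout here).
    """
    for i in range(0, len(images) - batch_size + 1, batch_size):
        yield np.ascontiguousarray(images[i:i + batch_size], dtype=np.float32)

class Int8Calibrator(CalibratorBase):
    """Entropy calibrator that replays real batches and caches the result."""

    def __init__(self, images, batch_size, cache_file="calibration.cache"):
        if CalibratorBase is not object:
            super().__init__()
        self.batch_size = batch_size
        self.cache_file = cache_file
        self.batches = batch_stream(images, batch_size)
        # One device buffer, reused for every batch (single-input network assumed)
        self.device_input = cuda.mem_alloc(
            batch_size * int(np.prod(images.shape[1:])) * np.float32().nbytes)

    def get_batch_size(self):
        return self.batch_size

    def get_batch(self, names):
        try:
            batch = next(self.batches)
        except StopIteration:
            return None  # tells TensorRT the calibration data is exhausted
        cuda.memcpy_htod(self.device_input, batch)
        return [int(self.device_input)]

    def read_calibration_cache(self):
        # Reusing a cached table skips recalibration on later builds
        if os.path.exists(self.cache_file):
            with open(self.cache_file, "rb") as f:
                return f.read()

    def write_calibration_cache(self, cache):
        with open(self.cache_file, "wb") as f:
            f.write(cache)
```

During engine building you would set `config.set_flag(trt.BuilderFlag.INT8)` and `config.int8_calibrator = Int8Calibrator(...)`; after the build, the written cache file is what you reference from the nvinfer config (`int8-calib-file=calibration.cache`). Note the calibration images should come from your real input distribution, not random data, or the computed dynamic ranges will be off.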