INT8 Quantization: how to generate and read the calibration table?

Hello everyone,

I am using TensorRT to quantize an object-detection DNN called “Pixor”.
I am using Python3 + Tensorflow 1.12 + TensorRT 3.0.2

The quantization works fine for me. However, I would like to generate and read the calibration table in order to understand whether my calibration dataset is good enough.

I would also like to ask whether I can generate the activation histograms shown in these slides:
http://on-demand.gputechconf.com/gtc/2017/presentation/s7310-8-bit-inference-with-tensorrt.pdf
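
To illustrate what I mean, here is a rough sketch of how I could collect per-layer histograms of |activation| from the FP32 TensorFlow graph before calibration. The placeholder and tensor names, and the plot_activation_histograms helper, are only examples I made up for this post:

import numpy as np
import tensorflow as tf            # TF 1.x, matching the setup above
import matplotlib.pyplot as plt

def plot_activation_histograms(frozen_graph_def, input_name, tensor_names, calib_batches, bins=2048):
    # Run the FP32 graph on a few calibration batches and histogram |activation| per tensor.
    graph = tf.Graph()
    with graph.as_default():
        tf.import_graph_def(frozen_graph_def, name="")
    input_t = graph.get_tensor_by_name(input_name)
    fetches = [graph.get_tensor_by_name(name) for name in tensor_names]
    collected = {name: [] for name in tensor_names}
    with tf.Session(graph=graph) as sess:
        for batch in calib_batches:
            for name, act in zip(tensor_names, sess.run(fetches, feed_dict={input_t: batch})):
                collected[name].append(np.abs(act).ravel())
    for name, chunks in collected.items():
        plt.figure()
        plt.hist(np.concatenate(chunks), bins=bins)
        plt.title(name)
        plt.xlabel("|activation|")
        plt.ylabel("count")
        plt.savefig(name.replace("/", "_").replace(":", "_") + "_hist.png")

# Example call with made-up names from my graph:
# plot_activation_histograms(graph_def, "InputPH_0:0",
#                            ["Up_sample_6/conv2d_25/Relu:0"], calib_batches)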

Thank you in advance,
Fares

Unfortunately, we have not made the details of the calibration table public. We are always listening to community feedback, so please stay tuned for future announcements.

Hello. Thanks for your answer,

However, I found the following Python code in an NVIDIA tutorial, which extracts the TensorRT calibration table after calibration is done:

for n in trt_graph.node:
    if n.op == "TRTEngineOp":
        node_name = n.name.replace("/", "_")
        print("Node: %s, %s" % (n.op, node_name))
        # "calibration_data" holds the serialized calibration table of this engine node
        with tf.gfile.GFile("%s.calib_table" % node_name, "wb") as f:
            f.write(n.attr["calibration_data"].s)
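
To look at one of the resulting files I simply open it as text (the file name here is only an example; it depends on the TRTEngineOp node name):

# Dump one saved calibration table as text; "my_trt_op_0.calib_table" is only
# an example file name for what the snippet above produces.
with open("my_trt_op_0.calib_table", "rb") as f:
    print(f.read().decode("utf-8", errors="replace"))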

This code generates a calibration file for each TRTEngineOp node in trt_graph. However, I can't make sense of the contents.
I get something like this:

Up_sample_6/conv2d_25/Relu: 3f556f06
InputPH_1: 3e0324b6
Up_sample_6/conv2d_26/Relu: 3eee7366
OutputPH_0: 3e59daba
Up_sample_6/conv2d_27/Relu: 3dc77c91
InputPH_0: 3f41b751
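
If I interpret each hex string as a big-endian IEEE 754 float32, I get small positive numbers that look like per-tensor scaling factors, but this is only my guess:

import struct

# Decode one calibration-table entry, assuming the hex string is the
# big-endian IEEE 754 encoding of a float32 (my guess, not documented).
def decode_entry(hex_str):
    return struct.unpack("!f", bytes.fromhex(hex_str))[0]

print(decode_entry("3f556f06"))   # roughly 0.83 for Up_sample_6/conv2d_25/Relu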

Can you please explain how to read these files?