How can I acquire Int8 model from tensorrt Int8 Engine?

Thanks for helping!

Hi,
You can serialize your TRT engine to a file and reuse it at a later time for inference.

Please refer to below link for more details:
https://docs.nvidia.com/deeplearning/sdk/tensorrt-archived/tensorrt-601/tensorrt-developer-guide/index.html#serial_model_c
https://docs.nvidia.com/deeplearning/sdk/tensorrt-archived/tensorrt-601/tensorrt-developer-guide/index.html#serial_model_python
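A minimal sketch of the serialize/deserialize round trip with the TensorRT Python API. This assumes `engine` is an already-built `ICudaEngine` (e.g. produced by `builder.build_cuda_engine(network)` with INT8 mode enabled) and that a GPU with TensorRT installed is available; the file name `model.engine` is just an example:

```python
import tensorrt as trt

# Assumption: `engine` is a built ICudaEngine (INT8 calibration already done).
# Save the serialized engine to disk.
with open("model.engine", "wb") as f:
    f.write(engine.serialize())

# Later (possibly in another process): load it back for inference.
# The engine must be deserialized on the same GPU architecture and
# TensorRT version it was built with.
logger = trt.Logger(trt.Logger.WARNING)
runtime = trt.Runtime(logger)
with open("model.engine", "rb") as f:
    engine = runtime.deserialize_cuda_engine(f.read())
```

Note that a serialized engine is an opaque, device-specific plan file, not a portable model format.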

Thanks

I want to extract the quantized CNN kernel weights. After serializing and saving the model, how can I read the serialized engine back and access those parameters?