Is there an NVIDIA tool to check the content of the TRT engine?

Hi,

I used both trtexec and the TensorRT Python API (build_cuda_engine()) to build TensorRT engines from ONNX models, but once I generate the TRT engine I can't see the optimizations or how TensorRT edited the neural network before running inference. I'm wondering if there is a tool that lets me see the optimizations done by TensorRT. If not, is there a tool for profiling, or for seeing how TensorRT does things like INT8 quantization, removing outputs, deleting layers, etc.?

Thanks

Hi @chakibdace,

We don't have such a tool. However, there is some information in the verbose log while the engine is being built; it shows which fusions TRT applied.
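If you build the engine with the Python API, you can get the same information by passing a VERBOSE logger to the builder. Below is a rough sketch, assuming the TensorRT 7-style API that matches the build_cuda_engine() call you mentioned ("model.onnx" is just a placeholder path):

import tensorrt as trt

# A VERBOSE logger makes the builder print layer fusion and tactic selection details
TRT_LOGGER = trt.Logger(trt.Logger.VERBOSE)

builder = trt.Builder(TRT_LOGGER)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, TRT_LOGGER)

# Parse the ONNX model ("model.onnx" is a placeholder)
with open("model.onnx", "rb") as f:
    parser.parse(f.read())

# Fusion and optimization messages are emitted to the log during this call
engine = builder.build_cuda_engine(network)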

Please check out trtexec and run it with the --verbose option.
https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/trtexec
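For example, a command like the following should print the optimization and fusion log while the engine is built ("model.onnx" and "model.engine" are placeholder paths):

trtexec --onnx=model.onnx --saveEngine=model.engine --verbose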

Thank you.