How to show per-layer inference time in TensorRT?

I just read Best Practices For TensorRT Performance :: NVIDIA Deep Learning TensorRT Documentation, but that page does not explain how to display per-layer inference times; its example only records the overall time.

Hi,
Could you share the model, script, profiler configuration, and performance output (if not already shared) so that we can help you better?
Alternatively, you can try running your model with the trtexec command.
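For example, trtexec can print a per-layer timing table with its --dumpProfile flag. A minimal sketch, assuming an ONNX model file named model.onnx (the filename is a placeholder for your own model):

```shell
# Build an engine from an ONNX model and print per-layer timings.
# --separateProfileRun keeps the profiling pass separate so it does not
# distort the end-to-end latency numbers; --exportProfile additionally
# writes the same per-layer data to a JSON file.
trtexec --onnx=model.onnx \
        --dumpProfile \
        --separateProfileRun \
        --exportProfile=layer_times.json
```

The dumped profile lists each layer by name with its average execution time, which directly answers the question above.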

While measuring model performance, make sure you consider the latency and throughput of the network inference itself, excluding the data pre- and post-processing overhead.
Please refer to the link below for more details:
https://docs.nvidia.com/deeplearning/tensorrt/archives/tensorrt-722/best-practices/index.html#measure-performance
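If you need per-layer timings from your own application rather than from trtexec, TensorRT's IProfiler callback can be attached to an execution context. A minimal sketch, assuming the TensorRT Python bindings; the ImportError fallback base class exists only so the aggregation logic can be exercised on a machine without TensorRT, and names like `engine` and `bindings` in the usage comment are placeholders from your own script:

```python
# Sketch of TensorRT's IProfiler callback for per-layer timing.
# On a real system with TensorRT installed, this subclasses trt.IProfiler;
# the `object` fallback is only for illustration without the library.
try:
    import tensorrt as trt
    _Base = trt.IProfiler
except ImportError:
    _Base = object  # stand-in base so the class can run anywhere

class LayerProfiler(_Base):
    """Accumulates the time TensorRT reports for each layer."""

    def __init__(self):
        super().__init__()
        self.times_ms = {}

    def report_layer_time(self, layer_name, ms):
        # TensorRT invokes this callback once per layer per inference.
        self.times_ms[layer_name] = self.times_ms.get(layer_name, 0.0) + ms

    def print_report(self):
        # Print layers sorted from slowest to fastest.
        for name, ms in sorted(self.times_ms.items(), key=lambda kv: -kv[1]):
            print(f"{name}: {ms:.3f} ms")

# Typical usage inside your inference script (hypothetical names):
#   context = engine.create_execution_context()
#   context.profiler = LayerProfiler()
#   context.execute_v2(bindings)      # layer times are reported here
#   context.profiler.print_report()
```

Attaching the profiler to the context is what makes TensorRT call report_layer_time after each execution, so the per-layer breakdown appears without any changes to the network itself.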

Thanks!