How to get TensorRT inference resource usage information?

TensorRT 8.4.

We develop C++ application to run tensorrt inference on AV(autonomous vehicle) with Drive Orin.

We need to log TensorRT inference resource usage information (GPU memory, GPU utilization, etc.) for monitoring while the application is running.

Is there a TensorRT API, or any other API, for that?


Dear @tjliupeng,
Tegrastats can show the requested information. You can also use nsys to check the API trace and identify bottlenecks.

Are there any functions to report the resource cost of an individual TensorRT engine? Tegrastats only prints the overall usage.

No. Tegrastats reports only cumulative usage.

