How does tegrastats measure average power consumption?

I am comparing two algorithms in terms of their computational burden / power consumption. To do so, I've used the tegrastats tool, which reports the current and average CPU power consumption.

My question is, how does tegrastats measure the average power consumption?

The algorithms I am testing do not run all the time. For example, one runs every second for about 0.1 s and is idle for the remaining ~0.9 s. If I run tegrastats with a 1 s report interval, it may therefore sample the current power consumption only while the algorithm is idle.

In other words, does the average power consumption figure also account for the time between tegrastats reports? Thank you very much.
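
To make the concern concrete, here is a back-of-the-envelope sketch. The wattages are made up purely for illustration: if the CPU drew, say, 5 W during the 0.1 s burst and 2 W while idle, the true duty-cycle average would differ noticeably from what a 1 Hz sampler reports if its ticks always land in the idle window.

```python
# Hypothetical numbers, just to illustrate the aliasing concern.
p_active, t_active = 5.0, 0.1   # W drawn during the 0.1 s burst
p_idle, t_idle = 2.0, 0.9       # W drawn during the idle remainder

true_avg = (p_active * t_active + p_idle * t_idle) / (t_active + t_idle)
print(f"true duty-cycle average: {true_avg:.2f} W")  # 2.30 W

# A 1 Hz sampler whose ticks always fall in the idle window
# would report ~2.00 W and miss the bursts entirely.
```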

hello jaywoaah,

you may refer to the documentation for more details: Tegrastats Utility.
please use the --interval option to configure the data-gathering interval in milliseconds.
thanks
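
For example, here is a minimal sketch that launches tegrastats with a 10 ms interval and averages the CPU-rail samples in user space, so short bursts are not aliased away. Note the assumptions: the rail label (POM_5V_CPU below) and the "cur/avg" mW log format vary between Jetson boards, so adjust the regex to match your tegrastats output, and tegrastats may need to be run with root privileges on some setups.

```python
import re
import subprocess
import time

# Assumed rail label and "cur/avg" mW format; board-specific.
RAIL = re.compile(r"POM_5V_CPU (\d+)/(\d+)")

# --interval takes the sampling period in milliseconds.
proc = subprocess.Popen(
    ["tegrastats", "--interval", "10"],
    stdout=subprocess.PIPE, text=True,
)

samples = []
deadline = time.time() + 10  # profile for ~10 s
try:
    for line in proc.stdout:
        m = RAIL.search(line)
        if m:
            samples.append(int(m.group(1)))  # instantaneous mW reading
        if time.time() >= deadline:
            break
finally:
    proc.terminate()

if samples:
    mean_mw = sum(samples) / len(samples)
    print(f"mean over {len(samples)} samples: {mean_mw:.0f} mW")
```

Averaging your own fine-grained samples this way sidesteps the question of how tegrastats internally computes its average, since the duty-cycle bursts are captured directly.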