Why does the GR3D_FREQ value always change a lot when I use sudo ~/tegrastats --interval 5000 to monitor the GPU?

The tegrastats result varies over a large range (more than a 20% swing), while nvidia-smi reports a much steadier value for the same program.

Can nvidia-smi run on a Tegra system?

Sorry, to clarify: I run the same code on x86_64 (monitored with nvidia-smi) and on Xavier (monitored with tegrastats).

Should I increase or decrease the interval?
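For what it's worth, the tegrastats interval is given in milliseconds, so --interval 5000 prints one sample every 5 seconds. If you want to compare sampling rates, something like this works (assuming tegrastats is at the usual location for your release):

sudo ~/tegrastats --interval 1000    # one GR3D sample per second
sudo ~/tegrastats --interval 5000    # one GR3D sample every 5 seconds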

Since the hardware is different, I don't think this comparison is very meaningful.

This might be because of DVFS. Try running jetson_clocks.sh first.
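For example, something along these lines should pin the clocks before measuring again (the script name and location vary by release: ~/jetson_clocks.sh on older L4T images, /usr/bin/jetson_clocks on newer JetPack versions, and nvpmodel mode 0 is the max-performance setting on Xavier):

sudo nvpmodel -m 0                   # optional: select the max-performance power mode first
sudo ~/jetson_clocks.sh              # lock CPU/GPU/EMC clocks at their maximum
sudo ~/tegrastats --interval 5000    # then re-check GR3D while your program runs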

Thanks for the reply, I will check that.

Does this thread help?
https://devtalk.nvidia.com/default/topic/1052188/

After running jetson_clocks.sh, the GPU utilization of our project still varies. (When the same project runs on the x86_64 platform, the GPU utilization is stable.)

But matrixMul (an NVIDIA CUDA sample) is OK; it stays in a narrow range, for example 8%~11%.
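For reference, the sample can be run while tegrastats is watched in a second terminal; the path below is the default CUDA samples location and may differ on your image:

cd /usr/local/cuda/samples/0_Simple/matrixMul
sudo make                            # or copy the samples to your home directory and build there
./matrixMul
# in another terminal:
sudo ~/tegrastats --interval 5000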

I will try to disable gdm (sudo systemctl stop gdm) and check again.
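A sketch of what I mean; the display manager may be named gdm3 depending on the Ubuntu image, and the default boot target can be switched instead of stopping the service each time:

sudo systemctl stop gdm                        # stop the desktop for this session
sudo systemctl set-default multi-user.target   # optional: boot to console from now on
sudo ~/tegrastats --interval 5000              # re-check GR3D with the desktop stopped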

I am not sure which tool you use to measure utilization on the x86 platform, since it is not tegrastats.

I am using the nvidia-smi command on the x86 platform.
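Roughly like this, assuming a recent driver; the query options below are just how I log utilization periodically:

nvidia-smi                                                            # one-shot snapshot
nvidia-smi --query-gpu=utilization.gpu,clocks.gr --format=csv -l 1    # log utilization and graphics clock every second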