I am not sure whether this is a power bug, but it still seems worth discussing here. I have a workload that uses both the CPU and the GPU.
When I run the workload at the maximum CPU frequency and maximum GPU frequency, the power consumption is about 9.2 watts. When I reduce the GPU frequency by 7 steps (there are 13 available frequencies), the power consumption stays the same. This seems very strange to me, because there should be a noticeable reduction in power when operating 7 steps below the maximum frequency compared to operating at the maximum.
Hi,
I just set it by writing one of the possible frequencies to the files under /sys/kernel/gbus/rate. By 7 steps I meant the 7th possible frequency out of the 13. I think the reason I don't see any power change is probably that even at the maximum frequency the utilization is not very high, so there is no power difference compared to a lower frequency. Is that a valid explanation?
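For reference, here is roughly how such a sweep could be scripted. This is only a minimal sketch: the gbus directory is taken from the path above, and the "possible_rates" file name and the ./my_workload command are assumptions that may need adjusting for your board and L4T release.

```python
#!/usr/bin/env python3
"""Minimal sketch of a GPU-frequency sweep for the test described above.

Assumptions (adjust for your board / L4T release):
  * GBUS_DIR is the clock directory mentioned in the post; the real
    location and the file names ("rate", "possible_rates") may differ.
  * ./my_workload is a placeholder for the CPU+GPU workload.
Writing clock rates normally requires root.
"""
import subprocess
import time

GBUS_DIR = "/sys/kernel/gbus"                       # path as given above (assumption)
RATE_FILE = GBUS_DIR + "/rate"
POSSIBLE_RATES_FILE = GBUS_DIR + "/possible_rates"  # assumed file name

def read_possible_rates():
    # Frequencies are usually listed space-separated, in Hz, ascending.
    with open(POSSIBLE_RATES_FILE) as f:
        return [int(x) for x in f.read().split()]

def set_gpu_rate(rate_hz):
    with open(RATE_FILE, "w") as f:
        f.write(str(rate_hz))

def run_workload():
    # Placeholder: start the CPU+GPU workload and wait for it to finish.
    subprocess.run(["./my_workload"], check=True)

if __name__ == "__main__":
    rates = read_possible_rates()
    print(f"{len(rates)} selectable GPU frequencies")
    # Compare the maximum frequency with the one 7 steps below it.
    for rate in (rates[-1], rates[-8]):
        set_gpu_rate(rate)
        time.sleep(1)                               # let the new clock settle
        print(f"Running workload at {rate / 1e6:.0f} MHz")
        run_workload()
```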
Thank you
You could check the tegrastats log at each frequency step while running the workload to see the CPU and GPU usage, and use that as a reference to judge whether the lack of power change is reasonable.
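For example, a minimal sketch along these lines could log the CPU and GPU load while the workload runs. It assumes the default tegrastats line format, where per-core CPU load appears as CPU [x%@freq,...] and GPU load as GR3D_FREQ x%; the exact fields can differ between Jetson models and L4T releases.

```python
#!/usr/bin/env python3
"""Sketch: capture tegrastats output and log average CPU and GPU usage.

Assumes the default tegrastats line format, e.g.
  ... CPU [12%@1420,5%@1420,off,...] ... GR3D_FREQ 34% ...
Field names may vary with the Jetson model / L4T version.
"""
import re
import subprocess

CPU_RE = re.compile(r"CPU \[([^\]]+)\]")
GPU_RE = re.compile(r"GR3D_FREQ (\d+)%")

def monitor(seconds=30, interval_ms=1000):
    proc = subprocess.Popen(
        ["tegrastats", "--interval", str(interval_ms)],
        stdout=subprocess.PIPE, text=True)
    try:
        for _ in range(seconds * 1000 // interval_ms):
            line = proc.stdout.readline()
            cpu = CPU_RE.search(line)
            gpu = GPU_RE.search(line)
            if cpu and gpu:
                # Per-core entries look like "12%@1420"; offline cores show "off".
                loads = [int(c.split("%")[0])
                         for c in cpu.group(1).split(",") if "%" in c]
                avg_cpu = sum(loads) / len(loads)
                print(f"CPU avg {avg_cpu:5.1f}%   GPU {gpu.group(1)}%")
    finally:
        proc.terminate()

if __name__ == "__main__":
    monitor()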