Hi! While profiling a scientific computation (a cryptographic algorithm), I see some atypical behaviour at extreme clock frequencies. Specifically, running time behaves stepwise at the lowest frequencies, while estimated energy shows high variance at the highest frequencies. This behaviour is consistent across devices and algorithm inputs (see the plot). Can anyone explain, or point me to a reference on, what drives these phenomena?
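For context, here is a minimal sketch of the kind of measurement loop behind the plot: time each run with a monotonic clock and estimate energy from a counter read before and after. The RAPL sysfs path and domain name are assumptions (Linux `powercap` on Intel; your platform's energy source may differ), and `profile`/`read_energy_uj` are hypothetical names, not from my actual harness.

```python
import time


def read_energy_uj(domain="intel-rapl:0"):
    """Read a cumulative energy counter in microjoules.

    Assumes the Linux powercap RAPL interface; returns None when the
    counter is unavailable (non-Intel hardware, containers, etc.).
    """
    path = f"/sys/class/powercap/{domain}/energy_uj"
    try:
        with open(path) as f:
            return int(f.read())
    except OSError:
        return None


def profile(workload, repeats=5):
    """Run `workload` several times, recording (seconds, microjoules) pairs.

    The energy delta is None when no counter is readable, so timing
    still works on machines without RAPL.
    """
    samples = []
    for _ in range(repeats):
        e0 = read_energy_uj()
        t0 = time.perf_counter()
        workload()
        dt = time.perf_counter() - t0
        e1 = read_energy_uj()
        de = (e1 - e0) if (e0 is not None and e1 is not None) else None
        samples.append((dt, de))
    return samples


samples = profile(lambda: sum(i * i for i in range(100_000)), repeats=3)
```

The frequency sweep itself is done outside this loop (e.g. pinning the governor and setting min/max frequency per step), so each call to `profile` measures one fixed frequency point.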