Hello
While training a neural network model on a Jetson Orin module, I observed that the energy consumption during the first epoch is noticeably higher than in subsequent epochs. Below is an excerpt from my training logs showing Time_total and Energy_total per epoch:
| Epoch | Time_total | Energy_total |
|---|---|---|
| 0 | 0.1355 | 1767414.685 |
| 1 | 0.0161 | 124208.9431 |
| 2 | 0.0158 | 119931.8558 |
| 3 | 0.0152 | 115094.0329 |
This pattern shows up even though the model architecture and training procedure are identical across epochs. I'm using Python and PyTorch on Ubuntu Linux.
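To check whether this could simply be one-time setup cost rather than a measurement error, I put together a toy reproduction of the same shape. Everything here is a stand-in, not my actual training code: `train_one_epoch` just simulates a loop whose first call pays a one-off setup cost (in the real run that would be things like CUDA context creation, cuDNN autotuning, or the dataloader filling its cache):

```python
import time

_warmed_up = False

def train_one_epoch(data):
    """Toy stand-in for a training loop; first call pays a one-time setup cost."""
    global _warmed_up
    if not _warmed_up:
        # Simulated one-time work: in a real PyTorch run this would be
        # CUDA context init, cuDNN autotuning, JIT warm-up, cache fills, etc.
        _ = [x * x for x in range(2_000_000)]
        _warmed_up = True
    # Simulated steady-state per-epoch work.
    return sum(x * 0.5 for x in data)

data = list(range(10_000))
times = []
for epoch in range(4):
    t0 = time.perf_counter()
    train_one_epoch(data)
    times.append(time.perf_counter() - t0)

print([round(t, 4) for t in times])
```

On my machine this prints a list where the first epoch's time dwarfs the rest, matching the pattern in my logs. Since energy is roughly power integrated over time, a longer first epoch alone would inflate Energy_total, though I can't tell whether that fully explains my numbers.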
Could the community shed light on why this might be happening? Is it a common occurrence or specific to neural networks? Any insights would be greatly appreciated.