Modulus FourCastNet PyTorch learning rate scheduler warning

@ngeneva @tbednarz
PyTorch warning:

…/lib/python3.9/site-packages/torch/optim/lr_scheduler.py:138: UserWarning: Detected call of lr_scheduler.step() before optimizer.step(). In PyTorch 1.1.0 and later, you should call them in the opposite order: optimizer.step() before lr_scheduler.step(). Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at torch.optim — PyTorch 1.13 documentation
warnings.warn("Detected call of lr_scheduler.step() before optimizer.step(). "


Hi @john.taylor1

Thanks for bringing this to our attention. I’ll get it added to our backlog.

This warning applies to PyTorch 1.1.0 and later: lr_scheduler.step() should be called after optimizer.step() in each iteration. If the order is reversed, PyTorch skips the first value of the learning rate schedule, so the first step of training runs at the wrong learning rate. Following the recommended order, as sketched below, ensures the schedule is applied correctly during training.
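For reference, here is a minimal sketch of the ordering the warning asks for. The model, optimizer, scheduler, and data below are placeholders for illustration, not the Modulus/FourCastNet training code:

```python
import torch

# Toy model, optimizer, and scheduler purely to show the call order.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

for epoch in range(100):
    for x, y in [(torch.randn(4, 10), torch.randn(4, 1))]:  # placeholder data
        optimizer.zero_grad()
        loss = torch.nn.functional.mse_loss(model(x), y)
        loss.backward()
        optimizer.step()   # update parameters first ...
    scheduler.step()       # ... then advance the LR schedule
    # Calling scheduler.step() before any optimizer.step() has run triggers
    # the UserWarning quoted above and skips the first scheduled LR value.
```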
