How to use Learning Rate Annealing etc in Modulus


I read in the advanced schemes section of the documentation that there are several loss-balancing methods, such as:

Learning Rate Annealing

Homoscedastic Task Uncertainty for Loss Weighting

However, there’s only an example for the Neural Tangent Kernel (NTK).

So are the other methods available in Modulus too? If so, are there examples of how to use them?
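As background on the second scheme above: homoscedastic task uncertainty weighting (Kendall et al., 2018) learns a log-variance s_i per loss term and weights each term as 0.5·exp(−s_i)·L_i + 0.5·s_i. The sketch below is framework-agnostic and is not the Modulus API; the toy losses and the hand-written gradient are purely illustrative. It shows that gradient descent drives each s_i toward log(L_i), so the effective weight exp(−s_i) ≈ 1/L_i balances tasks of very different magnitudes.

```python
import math

def total_loss(task_losses, log_vars):
    # Homoscedastic weighting: sum_i 0.5*exp(-s_i)*L_i + 0.5*s_i
    return sum(0.5 * math.exp(-s) * L + 0.5 * s
               for L, s in zip(task_losses, log_vars))

def grad_log_vars(task_losses, log_vars):
    # d/ds [0.5*exp(-s)*L + 0.5*s] = -0.5*exp(-s)*L + 0.5
    return [-0.5 * math.exp(-s) * L + 0.5
            for L, s in zip(task_losses, log_vars)]

# Two (fixed, toy) task losses with very different magnitudes.
task_losses = [100.0, 0.5]
log_vars = [0.0, 0.0]
lr = 0.1
for _ in range(2000):
    g = grad_log_vars(task_losses, log_vars)
    log_vars = [s - lr * gi for s, gi in zip(log_vars, g)]

# At the optimum s_i -> log(L_i), i.e. weight exp(-s_i) -> 1/L_i.
print([round(s, 3) for s in log_vars])  # prints [4.605, -0.693]
```

Note the +0.5·s_i regularizer is what stops the trivial solution of driving every weight to zero.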


You set it in the config.yaml, e.g.:

defaults:
  - modulus_default
  - arch:
      - modified_fourier
  - scheduler: tf_exponential_lr
  - optimizer: adam
  - loss: lr_annealing
  - _self_
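For intuition about what `loss: lr_annealing` selects, here is a hedged, framework-agnostic sketch of the learning-rate annealing scheme of Wang et al. (2021), which rescales the boundary-condition loss so its gradients keep pace with the PDE-residual gradients. This is not Modulus's actual implementation; the toy objective and hand-written gradients are purely illustrative.

```python
# Sketch: lambda_hat = max|grad L_res| / mean|grad L_bc|, then an
# exponential moving average lambda <- (1 - alpha)*lambda + alpha*lambda_hat.

def grads(theta):
    # Toy two-term objective: a "residual" term with large gradients
    # (grad of 100*sum(t^2)) and a "boundary" term with small ones
    # (grad of 0.01*sum((t - 1)^2)).
    g_res = [200.0 * t for t in theta]
    g_bc = [0.02 * (t - 1.0) for t in theta]
    return g_res, g_bc

theta = [0.5, -0.3]
lam, alpha, lr = 1.0, 0.1, 1e-3
for _ in range(100):
    g_res, g_bc = grads(theta)
    max_res = max(abs(g) for g in g_res)
    mean_bc = sum(abs(g) for g in g_bc) / len(g_bc)
    lam_hat = max_res / (mean_bc + 1e-12)   # balance target
    lam = (1.0 - alpha) * lam + alpha * lam_hat  # EMA update
    # Gradient step on L_res + lam * L_bc
    theta = [t - lr * (gr + lam * gb)
             for t, gr, gb in zip(theta, g_res, g_bc)]

# lam grows far above 1, boosting the under-weighted boundary term.
print(lam > 1.0)
```

The EMA keeps the weight from jumping around step to step; in a real PINN the gradients come from autodiff rather than closed forms.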

Hi npstrike,

Ok thanks! I’ll try it.