How can I apply techniques such as temporal-loss weighting and time marching?

I want to try the tricks described in the document below, but they do not seem to be used
in the sample code for the 1D wave equation.

<1D Wave Equation>
https://docs.nvidia.com/deeplearning/modulus/user_guide/foundational/1d_wave_equation.html

For temporal-loss weighting, I think I can handle it by rewriting the lambda_weighting as follows:


interior = PointwiseInteriorConstraint(
    nodes=nodes,
    geometry=geo,
    outvar={"wave_equation": 0},
    batch_size=cfg.batch_size.interior,
    bounds={x: (0, L)},
    lambda_weighting={"wave_equation": Ct*(1-t_symbol/Te)+1},  # <---- this part
    param_ranges=time_range,
)
domain.add_constraint(interior, "interior")
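For the snippet above to work, `Ct`, `Te`, and `t_symbol` need to exist as SymPy objects; Modulus evaluates symbolic `lambda_weighting` expressions pointwise on the sampled coordinates. A minimal sketch of what those definitions might look like (`Ct` and `Te` are assumed names here, standing for the weighting amplitude and the end of the time range):

```python
import sympy as sp

# Hypothetical constants for the weighting schedule:
# Ct scales how strongly early times are favored, Te is the end of the time range.
Ct, Te = 10.0, 2.0

# Modulus conventionally uses a SymPy symbol named "t" for time.
t_symbol = sp.Symbol("t")

# Weight decays linearly from Ct+1 at t=0 down to 1 at t=Te,
# so early-time residuals dominate the loss.
weight = Ct * (1 - t_symbol / Te) + 1

print(weight.subs(t_symbol, 0))   # 11.0 at the initial time
print(weight.subs(t_symbol, Te))  # 1.0 at the final time
```

With this in place, the expression passed to `lambda_weighting` evaluates to a per-point weight on each sampled `(x, t)` batch.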

But for the time-marching trick, I have no idea how to implement it, because I need the current iteration number during training
and I don't know how to reference it.
If you have a sample code, it would be much appreciated.

Thanks in advance.


Hi @yusuke.takara Have you managed to implement this?
thank you

Hi there,
We don't have an example for this, but it is a common trick used in many papers.

This isn't a tested solution, but one way you could try it is by writing a custom Node that has an input key t_0 (say this is parameterized between 0 and 1 for the scaled time range and is in your constraint parameterization) and an output key t.

The evaluate function can be a small torch.nn.Module that looks something like this:

class TimeMarcher(torch.nn.Module):

    def __init__(self, max_iter):
        super().__init__()
        self.max_iter = max_iter  # total number of training iterations
        self.iter = 0

    def forward(self, inputs):
        # Expand the marched time range linearly over the first half of training
        t_max = min(1.0, 2 * self.iter / self.max_iter)
        self.iter += 1
        return {"t": inputs["t_0"] * t_max}
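To sanity-check the schedule outside of Modulus, here is a self-contained version of the same idea that you can run directly. Note that `max_iter` is passed in explicitly here, since the module cannot otherwise see the training loop's iteration budget (the name and the linear ramp are just one possible schedule):

```python
import torch

class TimeMarcher(torch.nn.Module):
    """Maps a scaled time input t_0 in [0, 1] to t in [0, t_max], where
    t_max grows linearly over the first half of training and then stays at 1."""

    def __init__(self, max_iter):
        super().__init__()
        self.max_iter = max_iter
        self.iter = 0

    def forward(self, inputs):
        # t_max ramps from 0 to 1 over the first max_iter/2 calls
        t_max = min(1.0, 2 * self.iter / self.max_iter)
        self.iter += 1
        return {"t": inputs["t_0"] * t_max}

marcher = TimeMarcher(max_iter=100)
t_0 = torch.tensor([0.0, 0.5, 1.0])

out0 = marcher({"t_0": t_0})["t"]   # iteration 0: t_max = 0, all times collapse to 0
for _ in range(49):
    marcher({"t_0": t_0})
out50 = marcher({"t_0": t_0})["t"]  # iteration 50: t_max = 1, full time range restored
```

Because the module counts its own forward calls, it should only be attached to the constraint whose time range you want to march; other constraints sampling `t` directly are unaffected.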

Just an idea, good luck with your experimenting. There is also a moving-time-window approach, which we have an example of, as an alternative method for temporal learning.