Higher-order derivatives in importance sampling


I am considering several functions to be used in the importance sampling scheme proposed by Modulus. In the lid-driven cavity example, the 2-norm of the velocity derivative is used. If higher-order derivatives are needed, how should one proceed? This would be of interest if one wanted to use the residual as a sampling measure (as for the heat equation or Navier-Stokes, where a second-order term appears).
In the list of required outputs of the graph, it is possible to specify which key (e.g., T in the heat equation) holds the derivative with respect to another key (e.g., x). How do you specify that you want a higher-order derivative with respect to that key?


Hi @nripamont

For importance sampling in the LDC example, the first-order derivatives are used. Changing to higher order should be straightforward (assuming the gradients can be calculated in the graph). Higher-order derivatives can be specified in the output keys and will then show up in the output dictionary.

Derivative strings are converted into Keys using the diff_str convention: u__x is du/dx = Key("u", derivatives=[Key("x")]), u__x__x is d2u/dx2 = Key("u", derivatives=[Key("x"), Key("x")]), u__x__y is d2u/dxdy = Key("u", derivatives=[Key("x"), Key("y")]), etc.
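The naming convention itself is easy to illustrate with a small stand-alone helper (a sketch only — parse_diff_str is a hypothetical name, not a Modulus function):

```python
def parse_diff_str(key_str):
    """Split a diff_str-style name like 'u__x__x' into the base
    variable and the list of derivative variables."""
    base, *derivs = key_str.split("__")
    return base, derivs

# 'u__x__y' means d2u/dxdy: base variable 'u', derivatives in x then y
print(parse_diff_str("u__x__y"))  # -> ('u', ['x', 'y'])
print(parse_diff_str("u"))        # -> ('u', [])
```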

So if you want to importance sample in the LDC example with second-order derivatives:

importance_model_graph = Graph(
    nodes,  # the node list from the LDC example
    invar=[Key("x"), Key("y")],
    req_names=[
        Key("u", derivatives=[Key("x"), Key("x")]),
    ],
).to(device)

def importance_measure(invar):
    outvar = importance_model_graph(
        Constraint._set_device(invar, device=device, requires_grad=True)
    )
    importance = (outvar["u__x__x"] ** 2) ** 0.5 + 10
    return importance.cpu().detach().numpy()

(You could try going higher to third order if you want; it will be much slower, but Modulus should do the autodiff for you.)
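To see the underlying mechanism outside Modulus, here is a minimal PyTorch sketch of how a second derivative like u__x__x can be obtained by nesting autograd calls (assuming a toy u(x) = x**3 in place of the network output):

```python
import torch

# Toy stand-in for a network output: u(x) = x**3, so d2u/dx2 = 6x
x = torch.linspace(0.0, 1.0, 5, requires_grad=True)
u = x ** 3

# First derivative du/dx; create_graph=True keeps the graph alive
# so we can differentiate a second time
u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]

# Second derivative d2u/dx2 (the analogue of u__x__x)
u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]

# Same form as the importance measure above, now with the
# second-order term
importance = (u_xx ** 2) ** 0.5 + 10
```

Each extra derivative order nests one more grad call (and one more backward graph), which is why third order works but is noticeably slower.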