Error when changing activation function

Hi there,

I would like to change the activation function for my physics-based model (the cylinder example). I followed the method described in the manual and used the correct import, which I found on this forum.

However, I receive the following error:

Error executing job with overrides: []
Traceback (most recent call last):
  File "/software/modulus/22.09/lib/python3.8/site-packages/modulus-22.9-py3.8.egg/modulus/hydra/", line 200, in instantiate_arch
    model, param = model_arch.from_config(model_cfg)
  File "/software/modulus/22.09/lib/python3.8/site-packages/modulus-22.9-py3.8.egg/modulus/models/", line 500, in from_config
    cfg["activation_fn"] = Activation[cfg["activation_fn"]]
  File "/software/modulus/22.09/lib/python3.8/site-packages/modulus-22.9-py3.8.egg/modulus/models/layers/", line 16, in __getitem__
    return super().__getitem__(name.upper())
AttributeError: 'Activation' object has no attribute 'upper'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "", line 85, in run
    flow_net = instantiate_arch(
  File "/software/modulus/22.09/lib/python3.8/site-packages/modulus-22.9-py3.8.egg/modulus/hydra/", line 210, in instantiate_arch
    raise Exception(fail) from e
Exception: Failed to initialize architecture.
 {'input_keys': [x, y], 'output_keys': [u, v, p], 'detach_keys': [???], 'scaling': None, 'layer_size': 512, 'nr_layers': 6, 'skip_connections': False, 'activation_fn': <Activation.SILU: 9>, 'adaptive_activations': False, 'weight_norm': True}

Set the environment variable HYDRA_FULL_ERROR=1 for a complete stack trace.
ERROR:torch.distributed.elastic.multiprocessing.api:failed (exitcode: 1) local_rank: 0 (pid: 121742) of binary: /software/modulus/22.09/bin/python3.8
Traceback (most recent call last):
  File "/software/modulus/22.09/bin/torchrun", line 8, in <module>
  File "/software/modulus/22.09/lib/python3.8/site-packages/torch/distributed/elastic/multiprocessing/errors/", line 346, in wrapper
    return f(*args, **kwargs)
  File "/software/modulus/22.09/lib/python3.8/site-packages/torch/distributed/", line 762, in main
  File "/software/modulus/22.09/lib/python3.8/site-packages/torch/distributed/", line 753, in run
  File "/software/modulus/22.09/lib/python3.8/site-packages/torch/distributed/launcher/", line 132, in __call__
    return launch_agent(self._config, self._entrypoint, list(args))
  File "/software/modulus/22.09/lib/python3.8/site-packages/torch/distributed/launcher/", line 246, in launch_agent
    raise ChildFailedError(

How can I fix this? It happens for different activation functions.

Many thanks in advance.

Hi @jflatter

Which activation functions are you trying? Could you post that portion of your config file?

Hi @patterson

I was trying different activation functions, such as tanh and silu.

I used the code provided here: Modulus Configuration - NVIDIA Docs

I didn’t change my config file; the relevant portion is:

defaults:
  - modulus_default
  - arch:
      - fully_connected
  - scheduler: tf_exponential_lr
  - optimizer: adam
  - loss: sum
  - _self_
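
For reference, these same architecture parameters can also be set directly in the config file rather than in Python. A hedged sketch (the field names and default values are taken from the parameter dump in the error message above; the exact schema path is an assumption):

```yaml
arch:
  fully_connected:
    layer_size: 512
    nr_layers: 6
    skip_connections: false
    activation_fn: silu
```

Note that a string value here would not trigger the error, for the reason shown in the traceback: the enum lookup calls `.upper()` on the value, which only works on strings.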

The Python code in my case is:

    ns = NavierStokes(nu=nu, rho=rho, dim=2, time=False)
    normal_dot_vel = NormalDotVec(["u", "v"])
    flow_net = instantiate_arch(
        input_keys=[Key("x"), Key("y")],
        output_keys=[Key("u"), Key("v"), Key("p")],
        cfg=cfg.arch.fully_connected,
        activation_fn=Activation.SILU,
    )
    nodes = (
        ns.make_nodes()
        + normal_dot_vel.make_nodes()
        + [flow_net.make_node(name="flow_network")]
    )

Many thanks in advance.

Hi @jflatter

Looks like this is a bug with the instantiate_arch function, which is designed to work with the config files. Here are two solutions that should work for you:

  1. Pass the activation as a string to this function (activation_fn="tanh" or activation_fn="silu")
  2. Construct your network the typical Pythonic way (this will make the config.yaml not fully usable):

flow_net = FullyConnectedArch(
    input_keys=[Key("x"), Key("y")],
    output_keys=[Key("u"), Key("v"), Key("p")],
    activation_fn=Activation.SILU,
)
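
To see why the string works but the enum member does not, the failure can be reproduced without Modulus. Below is a minimal stand-in for the Activation enum (the case-insensitive `__getitem__` mirrors the line shown in the traceback; the member value 9 for SILU comes from the error dump, and the rest of the members are hypothetical):

```python
from enum import Enum, EnumMeta

class _ActivationMeta(EnumMeta):
    # Mimics the case-insensitive lookup from the traceback:
    # return super().__getitem__(name.upper())
    def __getitem__(cls, name):
        return super().__getitem__(name.upper())

class Activation(Enum, metaclass=_ActivationMeta):
    # Hypothetical subset of the real enum; SILU = 9 matches the error dump
    TANH = 1
    SILU = 9

# A string has .upper(), so the lookup succeeds:
assert Activation["silu"] is Activation.SILU

# An enum member (what instantiate_arch ends up passing here) does not,
# reproducing: AttributeError: 'Activation' object has no attribute 'upper'
try:
    Activation[Activation.SILU]
except AttributeError as e:
    print(e)
```

This is why workaround 1 (passing `"silu"` rather than `Activation.SILU` to instantiate_arch) avoids the crash: the lookup is only exercised on strings.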

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.