Methods for Freezing Neural Network Weights (Transfer Learning)

Hello. I am training a model that combines two neural networks. One of them has already been pre-trained, and I have its weights ready. I would like to use these weights for transfer learning. Please advise me on how to freeze that network's weights while continuing to train the rest of the model.

Here is an example of my code.
I want to freeze the weights of "net2" during training.

net1 = instantiate_arch(
    input_keys=[Key("x"), Key("y"), Key("t")],
    output_keys=[Key("u")],
    cfg=cfg.arch.fully_connected,
)

# Freeze the weights of the "net2" network
net2 = instantiate_arch(
    input_keys=[Key("x"), Key("y")],
    output_keys=[Key("k")],
    cfg=cfg.arch.fully_connected,
)

nodes = [
    net1.make_node(name="network1"),
    net2.make_node(name="network2"),
]

Have a look at the conjugate heat transfer example in the documentation: Conjugate Heat Transfer - NVIDIA Docs

You can add optimize=False when creating the node, like this:
net2.make_node(name="network2", optimize=False)
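For reference, here is a minimal end-to-end sketch of that pattern, assuming the Modulus Sym import paths (older Modulus releases use modulus.hydra / modulus.key instead) and a hypothetical checkpoint file net2_pretrained.pth:

import torch
import modulus.sym
from modulus.sym.hydra import instantiate_arch, ModulusConfig
from modulus.sym.key import Key


@modulus.sym.main(config_path="conf", config_name="config")
def run(cfg: ModulusConfig) -> None:
    # Trainable network.
    net1 = instantiate_arch(
        input_keys=[Key("x"), Key("y"), Key("t")],
        output_keys=[Key("u")],
        cfg=cfg.arch.fully_connected,
    )

    # Pre-trained network whose weights should stay fixed.
    net2 = instantiate_arch(
        input_keys=[Key("x"), Key("y")],
        output_keys=[Key("k")],
        cfg=cfg.arch.fully_connected,
    )

    # The architectures are torch.nn.Module subclasses, so a standard
    # state dict can be loaded directly (hypothetical file name).
    net2.load_state_dict(torch.load("net2_pretrained.pth"))

    # optimize=False keeps network2's parameters out of the optimizer,
    # so only network1 is updated during training.
    nodes = [
        net1.make_node(name="network1"),
        net2.make_node(name="network2", optimize=False),
    ]
    # ... build the constraints, domain, and solver from `nodes` as usual ...


if __name__ == "__main__":
    run()

With optimize=False the frozen network still participates in the forward pass, but its parameters are never handed to the optimizer, which is exactly the freezing behavior needed for transfer learning.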

Thank you for your help; my issue has been resolved.
