Hello
I am trying to train a model with TAO. The documentation mentions three optimizers that can be configured, but I do not see any information about which parameters each optimizer accepts.
The example document only has this:
# Only ['sgd', 'adam'] are supported for optimizer
optimizer {
  sgd {
    lr: 0.01
    decay: 0.0
    momentum: 0.9
    nesterov: False
  }
}
Can someone please tell me which parameters need to be set to use the adam optimizer?
Sorry, I should clarify: when I swap the sgd block above for an adam block, I get an error like:
google.protobuf.text_format.ParseError: 33:7 : Message type "AdamOptimizerConfig" has no field named "beta1"
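
For reference, the adam block I tried looked roughly like this. The beta1/beta2 field names are my guesses based on the usual Adam hyperparameter names, which is presumably why the parser rejects them:

optimizer {
  adam {
    lr: 0.001
    beta1: 0.9    # guessed field name, rejected by the parser
    beta2: 0.999  # guessed field name
  }
}

So what I am really after is the exact field names that AdamOptimizerConfig defines in the spec proto.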