Using the Optimizer Learning Rate

I am looking to train a Faster R-CNN model using only the 'adam' or 'sgd' optimizer to regulate the learning rate. The documentation says that if I have a learning rate scheduler, it will override the optimizer's learning rate. However, if I remove the learning rate scheduler in order to use the optimizer's LR, I get an error saying I cannot run without a learning rate scheduler. How do I run with only the optimizer LR and no scheduler?

In Faster R-CNN training you should not delete the learning rate scheduler: the scheduler is how the learning rate policy is selected, which is why the framework refuses to run without one. The lr you pass to Adam or SGD is only the initial value; on its own it stays constant for the whole run, which is usually not what you want, and a scheduler is more flexible than a fixed initial value. If you really do want a constant learning rate, keep the scheduler but configure it so it never changes the rate.
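A minimal sketch of that last point, assuming a PyTorch-based training loop (the exact framework isn't stated in the question): if a scheduler is mandatory, you can pass one whose multiplicative factor is always 1.0, so the optimizer's initial lr is effectively the constant learning rate. The `Linear` model here is just a stand-in for the real Faster R-CNN model.

```python
import torch

# Stand-in model; in practice this would be the Faster R-CNN model.
model = torch.nn.Linear(4, 2)

# The lr set here is the constant rate you want to keep.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# LambdaLR multiplies the initial lr by lr_lambda(epoch); returning 1.0
# every epoch means the scheduler exists but never changes the rate.
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lambda epoch: 1.0)

for epoch in range(3):
    # ... forward pass, loss, backward pass would go here ...
    optimizer.step()
    scheduler.step()
    print(optimizer.param_groups[0]["lr"])  # stays at 1e-4 every epoch
```

This satisfies the framework's requirement for a scheduler while behaving exactly like running on the optimizer's fixed learning rate alone.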