
PyTorch: How to change the learning rate of an optimizer at any given moment (no LR schedule)

Is it possible in PyTorch to change the learning rate of the optimizer dynamically in the middle of training (I don't want to define a learning rate schedule beforehand)?

So let's say I have an optimizer:

optim = torch.optim.SGD(model.parameters(), lr=0.01)

Now, due to some tests I perform during training, I realize my learning rate is too high, so I want to change it to, say, 0.001. There doesn't seem to be a method like optim.set_lr(0.001), but is there some way to do this?


patapouf_ai

The learning rate is stored in optim.param_groups[i]['lr']. optim.param_groups is a list of parameter groups, each of which can have its own learning rate. Thus, simply doing:

for g in optim.param_groups:
    g['lr'] = 0.001

will do the trick.
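For context, here is a minimal, self-contained sketch of changing the learning rate mid-training; the tiny nn.Linear model, the epoch count, and the epoch-10 threshold are just placeholders:

import torch

model = torch.nn.Linear(10, 1)  # placeholder model
optim = torch.optim.SGD(model.parameters(), lr=0.01)

for epoch in range(20):
    # ... run your usual training step here ...
    if epoch == 10:  # e.g. some test tells you the lr is too high
        for g in optim.param_groups:
            g['lr'] = 0.001

print([g['lr'] for g in optim.param_groups])  # -> [0.001]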

Alternatively,

as mentioned in the comments, if your learning rate only depends on the epoch number, you can use a learning rate scheduler.

For example (modified example from the doc):

from torch.optim.lr_scheduler import LambdaLR

optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
# Assuming the optimizer has two parameter groups.
lambda_group1 = lambda epoch: epoch // 30
lambda_group2 = lambda epoch: 0.95 ** epoch
scheduler = LambdaLR(optimizer, lr_lambda=[lambda_group1, lambda_group2])
for epoch in range(100):
    train(...)
    validate(...)
    scheduler.step()

Also, there is a prebuilt scheduler, torch.optim.lr_scheduler.ReduceLROnPlateau, that reduces the learning rate when a monitored metric has stopped improving.
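If the monitored metric is a validation loss, a sketch in the same style as the example above (assuming validate(...) returns that loss) might look like:

from torch.optim.lr_scheduler import ReduceLROnPlateau

optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
# Halve the lr after 10 epochs without improvement of the monitored loss
scheduler = ReduceLROnPlateau(optimizer, mode='min', factor=0.5, patience=10)

for epoch in range(100):
    train(...)
    val_loss = validate(...)
    scheduler.step(val_loss)  # pass the monitored quantity to the scheduler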


I tried this, but it didn't change the lr for me; I had to make the change like this: for i in range(len(optimizer.param_groups)): optimizer.param_groups[i]['lr'] = new_lr
desertnaut

Instead of a loop in patapouf_ai's answer, you can do it directly via:

optim.param_groups[0]['lr'] = 0.001

This only works if you have a single parameter group (which, granted, is probably the most common case).
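If you do have several parameter groups, you can still set each one by index; a short sketch, assuming a hypothetical model with base and head submodules:

optim = torch.optim.SGD([
    {'params': model.base.parameters(), 'lr': 0.01},  # hypothetical submodule
    {'params': model.head.parameters(), 'lr': 0.1},   # hypothetical submodule
])
optim.param_groups[0]['lr'] = 0.001  # changes only the first (base) group
optim.param_groups[1]['lr'] = 0.01   # the second (head) group is set separately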