CosineAnnealingLR
- class torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max, eta_min=0.0, last_epoch=-1, verbose='deprecated')[source]
Set the learning rate of each parameter group using a cosine annealing schedule, where $\eta_{max}$ is set to the initial lr and $T_{cur}$ is the number of epochs since the last restart in SGDR:

$$\eta_t = \eta_{min} + \frac{1}{2}(\eta_{max} - \eta_{min})\left(1 + \cos\left(\frac{T_{cur}}{T_{max}}\pi\right)\right), \quad T_{cur} \neq (2k+1)T_{max};$$

$$\eta_{t+1} = \eta_t + \frac{1}{2}(\eta_{max} - \eta_{min})\left(1 - \cos\left(\frac{1}{T_{max}}\pi\right)\right), \quad T_{cur} = (2k+1)T_{max}.$$

When last_epoch=-1, sets the initial lr as lr. Notice that because the schedule is defined recursively, the learning rate can be simultaneously modified outside this scheduler by other operators. If the learning rate is set solely by this scheduler, the learning rate at each step becomes:

$$\eta_t = \eta_{min} + \frac{1}{2}(\eta_{max} - \eta_{min})\left(1 + \cos\left(\frac{T_{cur}}{T_{max}}\pi\right)\right)$$
It has been proposed in SGDR: Stochastic Gradient Descent with Warm Restarts. Note that this only implements the cosine annealing part of SGDR, and not the restarts.
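As a quick sanity check of the closed-form expression above, the following sketch compares the scheduler's learning rate against the formula at a few steps. The model and hyperparameter values here are illustrative placeholders, not part of the API.

```python
import math

import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import CosineAnnealingLR

# Illustrative setup: the model and hyperparameters are placeholders.
model = torch.nn.Linear(10, 1)
eta_max, eta_min, T_max = 0.1, 0.001, 100
optimizer = SGD(model.parameters(), lr=eta_max)
scheduler = CosineAnnealingLR(optimizer, T_max=T_max, eta_min=eta_min)

for t in range(T_max):
    closed_form = eta_min + 0.5 * (eta_max - eta_min) * (
        1 + math.cos(math.pi * t / T_max)
    )
    if t % 25 == 0:
        # get_last_lr() returns one value per parameter group.
        print(f"T_cur={t}: scheduler={scheduler.get_last_lr()[0]:.6f}, "
              f"closed form={closed_form:.6f}")
    optimizer.step()
    scheduler.step()
```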
- Parameters
optimizer (Optimizer) – Wrapped optimizer.
T_max (int) – Maximum number of iterations.
eta_min (float) – Minimum learning rate. Default: 0.
last_epoch (int) – The index of last epoch. Default: -1.
verbose (bool | str) – If True, prints a message to stdout for each update. Default: False.

Deprecated since version 2.2: verbose is deprecated. Please use get_last_lr() to access the learning rate.
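Putting the parameters together, here is a minimal, illustrative training loop; the model, loss, and synthetic data below are placeholders. Calling scheduler.step() once per epoch makes T_max count epochs; calling it once per batch would make T_max count iterations instead.

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import CosineAnnealingLR

# Placeholder model, loss, and synthetic data for illustration only.
model = torch.nn.Linear(10, 1)
criterion = torch.nn.MSELoss()
data = [(torch.randn(8, 10), torch.randn(8, 1)) for _ in range(5)]

num_epochs = 50
optimizer = SGD(model.parameters(), lr=0.1)
# Anneal from 0.1 down toward eta_min over num_epochs epochs.
scheduler = CosineAnnealingLR(optimizer, T_max=num_epochs, eta_min=1e-5)

for epoch in range(num_epochs):
    for inputs, targets in data:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()
    scheduler.step()  # advance the cosine schedule once per epoch
```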
- load_state_dict(state_dict)[source]
Load the scheduler’s state.
- Parameters
state_dict (dict) – scheduler state. Should be an object returned from a call to state_dict().
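A short sketch of round-tripping scheduler state, e.g. for checkpointing; the setup is illustrative and assumes the new scheduler wraps the same (or an equivalently constructed) optimizer.

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import CosineAnnealingLR

# Illustrative setup; the model and hyperparameters are placeholders.
model = torch.nn.Linear(10, 1)
optimizer = SGD(model.parameters(), lr=0.1)
scheduler = CosineAnnealingLR(optimizer, T_max=100)

for _ in range(10):
    optimizer.step()
    scheduler.step()

# Capture the state (e.g. to save in a checkpoint alongside the
# optimizer's state_dict), then restore it into a new scheduler.
state = scheduler.state_dict()
resumed = CosineAnnealingLR(optimizer, T_max=100)
resumed.load_state_dict(state)
print(resumed.last_epoch)  # 10 -- the schedule resumes where it left off
```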
- print_lr(is_verbose, group, lr, epoch=None)[source]
Display the current learning rate.
Deprecated since version 2.4: print_lr() is deprecated. Please use get_last_lr() to access the learning rate.
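With print_lr() deprecated, learning-rate logging can go through get_last_lr() instead; a minimal illustrative sketch:

```python
import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import CosineAnnealingLR

# Illustrative setup; the model and hyperparameters are placeholders.
model = torch.nn.Linear(10, 1)
optimizer = SGD(model.parameters(), lr=0.1)
scheduler = CosineAnnealingLR(optimizer, T_max=10)

for epoch in range(10):
    optimizer.step()
    scheduler.step()
    # get_last_lr() returns a list, one entry per parameter group.
    print(f"epoch {epoch}: lr = {scheduler.get_last_lr()[0]:.6f}")
```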