Learning Rate Schedules

Cosine Annealing

Introduced by Loshchilov and Hutter in SGDR: Stochastic Gradient Descent with Warm Restarts

Cosine Annealing is a learning rate schedule that starts with a large learning rate, decreases it relatively rapidly to a minimum value along a cosine curve, and then increases it rapidly again. Resetting the learning rate acts like a simulated restart of the learning process, and re-using the current (good) weights as the starting point of the restart is referred to as a "warm restart", in contrast to a "cold restart", where a new set of small random numbers would be used as the starting point.

$$\eta_{t} = \eta_{min}^{i} + \frac{1}{2}\left(\eta_{max}^{i}-\eta_{min}^{i}\right)\left(1+\cos\left(\frac{T_{cur}}{T_{i}}\pi\right)\right) $$

where $\eta_{min}^{i}$ and $\eta_{max}^{i}$ define the range of the learning rate, $T_{i}$ is the total number of epochs in the current cycle, and $T_{cur}$ accounts for how many epochs have been performed since the last restart.
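As a concrete sketch, the schedule above can be evaluated directly in a few lines of Python. The function name, the initial cycle length of 10 epochs, the learning rate range, and the doubling of the cycle length after each restart ($T_{mult} = 2$, a setting used in the paper's experiments) are illustrative assumptions, not prescribed values.

```python
import math

def cosine_annealing_lr(t_cur, t_i, eta_min=0.001, eta_max=0.1):
    """Learning rate after t_cur epochs of a cycle lasting t_i epochs,
    per the formula above: eta_max at t_cur = 0, eta_min at t_cur = t_i."""
    return eta_min + 0.5 * (eta_max - eta_min) * (1.0 + math.cos(math.pi * t_cur / t_i))

# Illustrative warm-restart loop: when a cycle ends, reset the schedule
# (t_cur = 0) but keep the current weights, and grow the next cycle.
t_i, t_mult = 10, 2   # assumed initial cycle length and multiplier
t_cur = 0
for epoch in range(70):
    lr = cosine_annealing_lr(t_cur, t_i)
    # ... train one epoch using learning rate `lr` ...
    t_cur += 1
    if t_cur >= t_i:  # warm restart
        t_cur = 0
        t_i *= t_mult
```

For reference, PyTorch ships a built-in implementation of this schedule as `torch.optim.lr_scheduler.CosineAnnealingWarmRestarts`.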

Text Source: Jason Brownlee

Source: SGDR: Stochastic Gradient Descent with Warm Restarts

Tasks


| Task | Papers | Share |
| --- | --- | --- |
| Language Modelling | 83 | 10.85% |
| Large Language Model | 50 | 6.54% |
| Question Answering | 49 | 6.41% |
| Retrieval | 27 | 3.53% |
| Text Generation | 26 | 3.40% |
| In-Context Learning | 24 | 3.14% |
| Sentence | 23 | 3.01% |
| Prompt Engineering | 22 | 2.88% |
| Code Generation | 18 | 2.35% |
