Linear Warmup With Linear Decay

Linear Warmup With Linear Decay is a learning rate schedule in which the learning rate is increased linearly from zero to its peak value over the first $n$ updates, and then decayed linearly (typically to zero) over the remaining updates.
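The schedule can be sketched as a simple function of the step count. The function and parameter names below are illustrative, not from any particular library; it assumes the learning rate starts at zero, peaks at `max_lr` after `warmup_steps` updates, and decays to zero at `total_steps`:

```python
def linear_warmup_linear_decay(step, max_lr, warmup_steps, total_steps):
    """Learning rate at a given update step (0-indexed).

    Linearly ramps from 0 up to max_lr over the first `warmup_steps`
    updates, then linearly decays back to 0 by `total_steps`.
    """
    if step < warmup_steps:
        # warmup phase: fraction of the way through warmup
        return max_lr * (step + 1) / warmup_steps
    # decay phase: fraction of the decay window remaining
    decay_steps = total_steps - warmup_steps
    return max_lr * max(0.0, (total_steps - step) / decay_steps)
```

For example, with `max_lr=1.0`, `warmup_steps=10`, and `total_steps=100`, the rate reaches 1.0 at step 9, is halfway through its decay (0.5) at step 55, and hits 0.0 at step 100. Libraries such as PyTorch expose the same idea through generic schedulers (e.g. `LambdaLR`) by passing a multiplier function of this shape.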


Task                        Papers   Share
Language Modelling          85       13.30%
Text Classification         25       3.91%
Question Answering          25       3.91%
Sentiment Analysis          21       3.29%
Named Entity Recognition    18       2.82%
Information Retrieval       15       2.35%
Machine Translation         14       2.19%
Speech Recognition          13       2.03%
NER                         12       1.88%
