Linear Warmup With Linear Decay

Linear Warmup With Linear Decay is a learning rate schedule that increases the learning rate linearly from zero to a peak value over the first $n$ updates, then decays it linearly (typically down to zero) over the remaining updates. The warmup phase helps stabilize early training, while the decay phase gradually reduces the step size as the model converges.
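The schedule above can be sketched as a small function of the current step. This is a minimal illustration, not tied to any particular framework; the function name and parameters (`warmup_steps`, `total_steps`, `base_lr`) are chosen here for clarity.

```python
def linear_warmup_linear_decay(step, warmup_steps, total_steps, base_lr):
    """Learning rate at a given update step.

    Rises linearly from ~0 to base_lr over the first warmup_steps
    updates, then decays linearly from base_lr to 0 at total_steps.
    """
    if step < warmup_steps:
        # Warmup: fraction of base_lr proportional to progress.
        return base_lr * (step + 1) / warmup_steps
    # Decay: remaining fraction of the post-warmup budget.
    remaining = total_steps - warmup_steps
    return base_lr * max(0.0, total_steps - step) / remaining


# Example: 10 warmup steps, 100 total steps, peak LR 1e-3.
# The LR peaks at step 9-10, then falls linearly to 0 at step 100.
for step in (0, 9, 55, 100):
    print(step, linear_warmup_linear_decay(step, 10, 100, 1e-3))
```

In practice, frameworks usually express this as a multiplier applied to a base learning rate each step (e.g. via a per-step scheduler callback), but the piecewise-linear shape is the same.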

Tasks

| Task                     | Papers | Share  |
|--------------------------|--------|--------|
| Language Modelling       | 72     | 11.37% |
| Question Answering       | 26     | 4.11%  |
| Sentiment Analysis       | 24     | 3.79%  |
| Text Classification      | 18     | 2.84%  |
| Named Entity Recognition | 17     | 2.69%  |
| Machine Translation      | 17     | 2.69%  |
| NER                      | 13     | 2.05%  |
| Text Generation          | 12     | 1.90%  |
| Knowledge Distillation   | 11     | 1.74%  |
