Learning Rate Schedules

Linear Warmup With Linear Decay

Linear Warmup With Linear Decay is a learning rate schedule in which the learning rate is increased linearly from zero (or a small initial value) to a peak value over the first $n$ warmup updates, and then decayed linearly back down over the remaining updates.
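A minimal sketch of such a schedule as a plain function of the update step; the function name, the zero starting value, and the `min_lr` floor are illustrative choices, not part of any particular library's API:

```python
def lr_at_step(step, peak_lr, warmup_steps, total_steps, min_lr=0.0):
    """Linear warmup to peak_lr over warmup_steps, then linear decay to min_lr."""
    if step < warmup_steps:
        # Warmup phase: scale linearly from near zero up to peak_lr.
        return peak_lr * (step + 1) / warmup_steps
    # Decay phase: interpolate linearly from peak_lr down to min_lr.
    decay_steps = max(1, total_steps - warmup_steps)
    progress = min(1.0, (step - warmup_steps) / decay_steps)
    return peak_lr + (min_lr - peak_lr) * progress
```

In practice this kind of function is typically passed to an optimizer wrapper (for example as the multiplier in a lambda-based scheduler) and evaluated once per update.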

Tasks

Task Papers Share
RAG 275 20.74%
Retrieval 206 15.54%
Question Answering 73 5.51%
Language Modelling 58 4.37%
Language Modeling 50 3.77%
Large Language Model 42 3.17%
Information Retrieval 24 1.81%
Text Classification 22 1.66%
Text Generation 20 1.51%
