Exponential Decay is a learning rate schedule in which the learning rate decays exponentially as training progresses:
$$ \text{lr} = \text{lr}_{0}\exp\left(-kt\right) $$
where $\text{lr}_{0}$ is the initial learning rate, $k$ is the decay rate, and $t$ is the iteration (or epoch) index.
Image Credit: Suki Lau
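A minimal sketch of the schedule, assuming a hypothetical helper `exponential_decay` and illustrative values for `lr0` and `k` (not taken from the source):

```python
import math

def exponential_decay(lr0: float, k: float, t: int) -> float:
    """Learning rate at step t: lr = lr0 * exp(-k * t)."""
    return lr0 * math.exp(-k * t)

# Example: start at lr0 = 0.1 with decay rate k = 0.01
for t in (0, 100, 500, 1000):
    print(f"step {t:4d}: lr = {exponential_decay(0.1, 0.01, t):.5f}")
```

Deep learning frameworks typically expose an equivalent scheduler; for example, PyTorch's `torch.optim.lr_scheduler.ExponentialLR` multiplies the learning rate by a fixed factor `gamma` each epoch, which corresponds to $k = -\ln(\gamma)$.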
| Task | Papers | Share |
|---|---|---|
| Image Classification | 12 | 9.68% |
| Computational Efficiency | 5 | 4.03% |
| Reinforcement Learning | 5 | 4.03% |
| Classification | 5 | 4.03% |
| Time Series Analysis | 4 | 3.23% |
| Large Language Model | 3 | 2.42% |
| Quantum Machine Learning | 3 | 2.42% |
| Multi-agent Reinforcement Learning | 3 | 2.42% |
| Graph Neural Network | 3 | 2.42% |