Learning Rate Schedules

Step Decay

Step Decay is a learning rate schedule that drops the learning rate by a multiplicative factor every fixed number of epochs. Both the drop factor and the interval between drops are hyperparameters.
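A minimal sketch of the schedule as a plain function (names like `drop_factor` and `epochs_per_drop` are illustrative, not from any particular library):

```python
def step_decay(initial_lr: float, drop_factor: float,
               epochs_per_drop: int, epoch: int) -> float:
    """Learning rate at a given epoch under step decay.

    The rate is multiplied by `drop_factor` once every
    `epochs_per_drop` epochs (integer division counts the drops).
    """
    num_drops = epoch // epochs_per_drop
    return initial_lr * (drop_factor ** num_drops)


# Example: start at 0.1, halve every 10 epochs.
for epoch in (0, 9, 10, 25):
    print(epoch, step_decay(0.1, 0.5, 10, epoch))
```

Deep learning frameworks ship this schedule directly; in PyTorch, for instance, `torch.optim.lr_scheduler.StepLR(optimizer, step_size, gamma)` applies the same rule.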

Image Credit: Suki Lau

