Step Decay

Step Decay is a learning rate schedule that drops the learning rate by a constant factor every few epochs. Both the drop factor and the number of epochs between drops (the step size) are hyperparameters.
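The schedule can be written as lr(epoch) = initial_lr × factor^floor(epoch / step_size). Below is a minimal sketch in Python; the function name `step_decay` and the values 0.1 (initial rate), 0.5 (drop factor), and 10 (epochs per drop) are illustrative assumptions, not fixed choices from the source.

```python
import math

def step_decay(epoch, initial_lr=0.1, drop_factor=0.5, epochs_per_drop=10):
    """Return the learning rate for a given epoch under step decay.

    All three keyword arguments are hyperparameters; the defaults here
    are only example values.
    """
    return initial_lr * drop_factor ** math.floor(epoch / epochs_per_drop)

# Example: with these settings the rate halves every 10 epochs.
for epoch in (0, 9, 10, 19, 20, 30):
    print(epoch, step_decay(epoch))
```

For reference, PyTorch ships an equivalent scheduler, `torch.optim.lr_scheduler.StepLR(optimizer, step_size, gamma)`, which multiplies the learning rate by `gamma` every `step_size` calls to `scheduler.step()` (typically once per epoch).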
