Linear Warmup is a learning rate schedule that linearly increases the learning rate from a low initial value to a target value over a fixed number of steps, after which the rate is held constant. This reduces volatility in the early stages of training, when weights are randomly initialized and large updates can destabilize optimization.
Image Credit: Chengwei Zhang
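The schedule can be sketched as a small helper function; the names `warmup_steps`, `base_lr`, and `init_lr` are illustrative choices, not a fixed API:

```python
def linear_warmup(step, warmup_steps, base_lr, init_lr=0.0):
    """Linearly increase the learning rate from init_lr to base_lr over
    warmup_steps, then hold it constant at base_lr thereafter."""
    if step >= warmup_steps:
        return base_lr
    # Linear interpolation between init_lr and base_lr
    return init_lr + (base_lr - init_lr) * step / warmup_steps
```

For example, with `warmup_steps=100` and `base_lr=0.1`, the rate grows from 0.0 at step 0 to 0.05 at step 50, reaches 0.1 at step 100, and stays there. In practice, warmup is often followed by a decay schedule (e.g. cosine or step decay) rather than a constant rate.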