Perturbated Gradients Updating within Unit Space for Deep Learning

1 Oct 2021  ·  Ching-Hsun Tseng, Liu-Hsueh Cheng, Shin-Jye Lee, Xiaojun Zeng

In deep learning, optimization plays a vital role. Focusing on image classification, this work examines the pros and cons of widely used optimizers and proposes a new one: the Perturbated Unit Gradient Descent (PUGD) algorithm, which extends normalized gradient operations over tensors with a perturbation so that updates take place in unit space. Through a set of experiments and analyses, we show that PUGD performs locally bounded updates, meaning the magnitude of each step is controlled. Moreover, PUGD can push models toward a flat minimum, where the error remains approximately constant, both because gradient normalization naturally avoids stationary points and because the perturbation scans sharpness within the unit ball. In a series of rigorous experiments, PUGD helps models achieve state-of-the-art Top-1 accuracy on Tiny ImageNet and competitive performance on CIFAR-{10, 100}. We open-source our code at: https://github.com/hanktseng131415go/PUGD.
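The exact update rule lives in the linked repository; as a rough illustration of the idea described above (a normalized-gradient perturbation on the unit ball followed by a unit-normalized step), here is a minimal PyTorch sketch. The function name `pugd_step`, the hyperparameters `lr` and `rho`, and the way the two gradients are combined are assumptions made for illustration, not the paper's exact formulation.

```python
import torch


def pugd_step(model, loss_fn, inputs, targets, lr=0.1, rho=0.05):
    """One PUGD-like update: a sketch based on the abstract, not the
    authors' exact code. `lr` and `rho` are illustrative hyperparameters.
    Assumes every parameter receives a gradient."""
    model.zero_grad()

    # 1) Gradient at the current point.
    loss_fn(model(inputs), targets).backward()
    grads = [p.grad.detach().clone() for p in model.parameters()]

    # Global gradient norm, treating all parameters as one flat tensor.
    g_norm = torch.sqrt(sum((g ** 2).sum() for g in grads)) + 1e-12

    # 2) Perturb along the normalized gradient, i.e. within the unit
    #    ball scaled by rho ("scanning sharpness in the unit ball").
    with torch.no_grad():
        for p, g in zip(model.parameters(), grads):
            p.add_(rho * g / g_norm)

    # 3) Gradient at the perturbed point.
    model.zero_grad()
    loss_fn(model(inputs), targets).backward()

    with torch.no_grad():
        # 4) Combine the two gradients (an assumption: the paper may
        #    weight them differently) and normalize the result, so the
        #    applied step lies in unit space and is locally bounded.
        combined = [g + p.grad for g, p in zip(grads, model.parameters())]
        c_norm = torch.sqrt(sum((c ** 2).sum() for c in combined)) + 1e-12
        for p, g, c in zip(model.parameters(), grads, combined):
            p.sub_(rho * g / g_norm)   # undo the perturbation
            p.sub_(lr * c / c_norm)    # unit-normalized update
    model.zero_grad()
```

A training loop would call something like `pugd_step(model, criterion, x, y)` once per batch; note that, as in SAM-style sharpness-aware methods, each update costs two forward-backward passes.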


Results from the Paper


| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Image Classification | CIFAR-10 | ViT-B/16 (PUGD) | Percentage correct | 99.13 | #12 |
| Image Classification | CIFAR-100 | ViT-B/16 (PUGD) | Percentage correct | 93.95 | #6 |
| Image Classification | Tiny ImageNet Classification | DeiT-B/16 (PUGD) | Validation Acc | 91.02% | #5 |
| Image Classification | Tiny ImageNet Classification | ViT-B/16 (PUGD) | Validation Acc | 90.74% | #7 |
