Dynamic Loss For Robust Learning

22 Nov 2022  ·  Shenwang Jiang, Jianan Li, Jizhou Zhang, Ying Wang, Tingfa Xu ·

Label noise and class imbalance commonly coexist in real-world data. Previous works on robust learning, however, usually address only one type of data bias and underperform when facing both. To bridge this gap, this work presents a novel meta-learning-based dynamic loss that automatically adjusts the objective function over the course of training to robustly learn a classifier from long-tailed noisy data. Concretely, the dynamic loss comprises a label corrector and a margin generator, which respectively correct noisy labels and generate additive per-class classification margins by perceiving the underlying data distribution as well as the learning state of the classifier. Equipped with a new hierarchical sampling strategy that enriches a small amount of unbiased metadata with diverse and hard samples, the two components of the dynamic loss are optimized jointly through meta-learning, cultivating a classifier that adapts well to clean and balanced test data. Extensive experiments show our method achieves state-of-the-art accuracy on multiple real-world and synthetic datasets with various types of data biases, including CIFAR-10/100, Animal-10N, ImageNet-LT, and WebVision. Code will soon be publicly available.
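The two ingredients described above can be sketched as a single loss term: the label corrector softens the observed (possibly noisy) labels, and the margin generator shifts the logits by additive per-class margins before the softmax. The sketch below is a minimal illustration under assumed interfaces, not the paper's implementation; the function name, the blending weight `alpha`, and the way corrections and margins are supplied (in the paper they are produced by meta-learned components) are all hypothetical.

```python
import numpy as np

def dynamic_loss(logits, noisy_labels, soft_corrections, margins, alpha=0.5):
    """Margin-adjusted cross-entropy with soft label correction (illustrative sketch).

    logits:           (N, C) classifier outputs
    noisy_labels:     (N,) observed, possibly noisy, integer labels
    soft_corrections: (N, C) corrected label distributions
                      (in the paper, produced by the meta-learned label corrector)
    margins:          (C,) additive per-class margins
                      (in the paper, produced by the meta-learned margin generator)
    alpha:            hypothetical blend between observed and corrected labels
    """
    n, c = logits.shape
    # Blend the one-hot noisy label with its soft correction.
    one_hot = np.eye(c)[noisy_labels]
    targets = alpha * one_hot + (1.0 - alpha) * soft_corrections
    # Shift logits by per-class margins before the softmax; a larger margin
    # for a tail class raises its effective score during training.
    adjusted = logits + margins[None, :]
    # Numerically stable log-softmax.
    adjusted = adjusted - adjusted.max(axis=1, keepdims=True)
    log_probs = adjusted - np.log(np.exp(adjusted).sum(axis=1, keepdims=True))
    # Soft cross-entropy against the blended targets.
    return float(-(targets * log_probs).sum(axis=1).mean())
```

With `alpha=1` and zero margins this reduces to standard cross-entropy on the observed labels; the meta-learning step in the paper would tune the corrections and margins so that this loss steers the classifier toward clean, balanced test performance.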

| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Learning with noisy labels | ANIMAL | Dynamic Loss | Accuracy | 86.5 | #6 |
| | | | Network | Vgg19-BN | |
| | | | ImageNet Pretrained | NO | |
| Image Classification | mini WebVision 1.0 | Dynamic Loss (Inception-ResNet-v2) | Top-1 Accuracy | 80.12 | #9 |
| | | | Top-5 Accuracy | 93.64 | #3 |
| | | | ImageNet Top-1 Accuracy | 74.76 | #19 |
| | | | ImageNet Top-5 Accuracy | 93.08 | #7 |
