RSG: A Simple but Effective Module for Learning Imbalanced Datasets

Imbalanced datasets widely exist in practice and are a great challenge for training deep neural models with good generalization on infrequent classes. In this work, we propose a new rare-class sample generator (RSG) to solve this problem. RSG aims to generate new samples for rare classes during training, and it has in particular the following advantages: (1) it is convenient to use and highly versatile, because it can be easily integrated into any kind of convolutional neural network, and it works well when combined with different loss functions, and (2) it is only used during the training phase, and therefore, no additional burden is imposed on deep neural networks during the testing phase. In extensive experimental evaluations, we verify the effectiveness of RSG. Furthermore, by leveraging RSG, we obtain competitive results on Imbalanced CIFAR and new state-of-the-art results on Places-LT, ImageNet-LT, and iNaturalist 2018. The source code is available at https://github.com/Jianf-Wang/RSG.
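The abstract describes RSG as a plug-in module that synthesizes extra rare-class samples during training and is dropped at test time. Below is a minimal, hypothetical sketch of how such a training-only module could be wired into a CNN; the class and parameter names (RareSampleGenerator, rare_classes, noise_scale) are illustrative assumptions and not the paper's actual method, which is available in the linked repository.

```python
# Conceptual sketch only: a training-only module that creates extra
# feature-level samples for rare classes. Names are hypothetical; the real
# RSG implementation lives at https://github.com/Jianf-Wang/RSG.
import torch
import torch.nn as nn


class RareSampleGenerator(nn.Module):
    """Toy stand-in for RSG: perturbs features of rare-class samples to
    produce additional training samples. Active only in training mode."""

    def __init__(self, rare_classes, noise_scale=0.1):
        super().__init__()
        self.rare_classes = set(rare_classes)
        self.noise_scale = noise_scale

    def forward(self, feats, labels):
        if not self.training:  # no effect (and no cost) at test time
            return feats, labels
        rare_mask = torch.tensor(
            [int(l) in self.rare_classes for l in labels], device=feats.device
        )
        if rare_mask.any():
            rare_feats = feats[rare_mask]
            new_feats = rare_feats + self.noise_scale * torch.randn_like(rare_feats)
            feats = torch.cat([feats, new_feats], dim=0)
            labels = torch.cat([labels, labels[rare_mask]], dim=0)
        return feats, labels


class CNNWithGenerator(nn.Module):
    """Backbone -> (training-only sample generator) -> classifier."""

    def __init__(self, num_classes=10, rare_classes=(7, 8, 9)):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.generator = RareSampleGenerator(rare_classes)
        self.classifier = nn.Linear(16, num_classes)

    def forward(self, x, labels=None):
        feats = self.backbone(x)
        if self.training and labels is not None:
            feats, labels = self.generator(feats, labels)
        return self.classifier(feats), labels


# Usage: extra rare-class samples appear only during training, so inference
# cost is identical to the plain backbone + classifier, and any standard
# loss (e.g. cross-entropy, LDAM) can be applied to the augmented batch.
model = CNNWithGenerator()
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
logits, y_aug = model(x, y)                    # training pass
loss = nn.CrossEntropyLoss()(logits, y_aug)
model.eval()
test_logits, _ = model(x)                      # no extra samples at test time
```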


Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Long-tail Learning | CIFAR-100-LT (ρ=100) | LDAM-DRW-RSG | Error Rate | 55.5 | #49 |
| Long-tail Learning | CIFAR-100-LT (ρ=50) | LDAM-DRW-RSG | Error Rate | 51.5 | #22 |
| Long-tail Learning | ImageNet-LT | LDAM-DRS-RSG | Top-1 Accuracy | 51.8 | #44 |
| Long-tail Learning | iNaturalist 2018 | LDAM-DRS-RSG | Top-1 Accuracy | 70.3% | #26 |
| Long-tail Learning | Places-LT | LDAM-DRS-RSG | Top-1 Accuracy | 39.3 | #17 |

Methods


No methods listed for this paper.