Distribution-Balanced Loss for Multi-Label Classification in Long-Tailed Datasets

ECCV 2020 · Tong Wu, Qingqiu Huang, Ziwei Liu, Yu Wang, Dahua Lin

We present a new loss function, the Distribution-Balanced Loss, for multi-label recognition problems that exhibit long-tailed class distributions. Compared to the conventional single-label classification problem, multi-label recognition is often more challenging due to two significant issues, namely the co-occurrence of labels and the dominance of negative labels (when treated as multiple binary classification problems). The Distribution-Balanced Loss tackles these issues through two key modifications to the standard binary cross-entropy loss: 1) a new way to re-balance the weights that takes into account the impact caused by label co-occurrence, and 2) a negative-tolerant regularization to mitigate the over-suppression of negative labels. Experiments on both Pascal VOC and COCO show that models trained with this new loss function achieve significant performance gains over existing methods. Code and models are available at: https://github.com/wutong16/DistributionBalancedLoss .
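To make the two modifications concrete, below is a minimal PyTorch sketch of a distribution-balanced BCE variant. It is an illustration under assumptions: the class name, hyper-parameter values (alpha, beta, mu, neg_scale, bias_kappa), and the exact smoothing of the re-balancing weight are placeholders for exposition, not the authors' reference implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DistributionBalancedLoss(nn.Module):
    """Illustrative distribution-balanced BCE variant (names/values are assumptions)."""

    def __init__(self, class_freq, alpha=0.1, beta=10.0, mu=0.2,
                 neg_scale=2.0, bias_kappa=0.05):
        super().__init__()
        # class_freq: per-class positive counts n_i over the training set, shape (C,)
        class_freq = torch.as_tensor(class_freq, dtype=torch.float)
        self.register_buffer("inv_freq", 1.0 / class_freq)  # ~ class-level sampling frequency
        prior = class_freq / class_freq.sum()
        # Class-specific bias for the negative-tolerant term; rare classes get a larger shift.
        self.register_buffer("bias", bias_kappa * torch.log(1.0 / prior - 1.0))
        self.alpha, self.beta, self.mu = alpha, beta, mu
        self.neg_scale = neg_scale  # lambda: softens the loss on negative labels

    def rebalance_weight(self, targets):
        # Instance-level frequency: sum of 1/n_i over the instance's positive labels.
        # Co-occurrence with frequent head classes inflates it, shrinking the weight.
        prob_instance = (targets * self.inv_freq).sum(dim=1, keepdim=True).clamp(min=1e-12)
        ratio = self.inv_freq / prob_instance
        # Smooth the raw ratio into a bounded weight in (alpha, alpha + 1).
        return self.alpha + torch.sigmoid(self.beta * (ratio - self.mu))

    def forward(self, logits, targets):
        # logits, targets: (batch, C); targets is multi-hot.
        weight = self.rebalance_weight(targets)
        shifted = logits - self.bias
        pos = targets * F.softplus(-shifted)  # log(1 + e^{-z}) for positive labels
        neg = (1.0 - targets) * F.softplus(self.neg_scale * shifted) / self.neg_scale
        return (weight * (pos + neg)).mean()


# Toy usage (class counts are made up): 16 samples, 4 classes.
criterion = DistributionBalancedLoss(class_freq=[500, 120, 30, 8])
loss = criterion(torch.randn(16, 4), torch.randint(0, 2, (16, 4)).float())
```

In this sketch, the re-balancing weight divides each class's class-level sampling frequency by the instance-level one, so an image whose positive labels co-occur with head classes contributes less; the negative term's 1/λ scaling and the class-prior bias soften the gradient from the many negative labels, mirroring the two ideas described in the abstract.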


Datasets


Introduced in the Paper:

COCO-MLT, VOC-MLT

Used in the Paper:

MS COCO

Results from the Paper


Task                 Dataset    Model                  Metric        Value   Global Rank
Long-tail Learning   COCO-MLT   DB Focal (ResNet-50)   Average mAP   53.55   #7
Long-tail Learning   VOC-MLT    DB Focal (ResNet-50)   Average mAP   78.94   #7

Methods


No methods listed for this paper.