Learning Multi-expert Distribution Calibration for Long-tailed Video Classification

22 May 2022  ·  Yufan Hu, Junyu Gao, Changsheng Xu

Most existing state-of-the-art video classification methods assume that the training data are uniformly distributed across classes. However, video data in the real world typically exhibit an imbalanced, long-tailed class distribution, resulting in models biased towards the head classes and relatively low performance on the tail classes. Since current long-tailed classification methods mostly focus on image classification, adapting them to video data is not a trivial extension. To address these challenges, we propose an end-to-end multi-expert distribution calibration method based on two-level distribution information. The method jointly considers the distribution of samples within each class (intra-class distribution) and the overall distribution across classes (inter-class distribution) to alleviate data imbalance under the long-tailed distribution. By modeling these two levels of distribution information, the model accounts for both head and tail classes and effectively transfers knowledge from the head classes to improve performance on the tail classes. Extensive experiments verify that our method achieves state-of-the-art performance on the long-tailed video classification task.
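The abstract only outlines the idea at a high level. As a rough, hypothetical sketch (not the paper's implementation), one common way to realize distribution calibration is to fit per-class Gaussian statistics over extracted features (intra-class distribution) and borrow statistics from nearby head classes to enrich a sparse tail class (inter-class transfer), then sample synthetic tail features for rebalanced training. All function names, parameters, and the Gaussian assumption below are illustrative choices, not details from the paper.

```python
# Hypothetical sketch of two-level distribution calibration (not the authors' code).
# Intra-class: estimate a Gaussian (mean, covariance) per class from its features.
# Inter-class: calibrate a tail class by mixing in statistics of its nearest head
# classes, then sample synthetic features to rebalance training.

import numpy as np

def class_statistics(features, labels, num_classes):
    """Per-class mean and covariance (intra-class distribution)."""
    stats = {}
    for c in range(num_classes):
        x = features[labels == c]
        mean = x.mean(axis=0)
        # Regularize the covariance so it stays positive definite for small classes.
        cov = np.cov(x, rowvar=False) + 1e-3 * np.eye(x.shape[1])
        stats[c] = (mean, cov)
    return stats

def calibrate_tail(stats, tail_class, head_classes, k=2, alpha=0.5):
    """Borrow inter-class knowledge: mix the tail class's Gaussian with its
    k nearest head-class Gaussians (nearest by distance between class means)."""
    t_mean, t_cov = stats[tail_class]
    neighbors = sorted(head_classes,
                       key=lambda c: np.linalg.norm(stats[c][0] - t_mean))[:k]
    mean = alpha * t_mean + (1 - alpha) * np.mean([stats[c][0] for c in neighbors], axis=0)
    cov = alpha * t_cov + (1 - alpha) * np.mean([stats[c][1] for c in neighbors], axis=0)
    return mean, cov

def sample_synthetic(mean, cov, n):
    """Draw synthetic features from the calibrated tail-class distribution."""
    return np.random.multivariate_normal(mean, cov, size=n)

# Toy usage: 3 head classes with many samples, 1 tail class with few.
rng = np.random.default_rng(0)
features = np.vstack([rng.normal(c, 1.0, size=(200 if c < 3 else 10, 16)) for c in range(4)])
labels = np.concatenate([np.full(200 if c < 3 else 10, c) for c in range(4)])

stats = class_statistics(features, labels, num_classes=4)
mean, cov = calibrate_tail(stats, tail_class=3, head_classes=[0, 1, 2])
extra = sample_synthetic(mean, cov, n=190)  # augment the tail class before training
```

In this toy setup the calibrated tail distribution is a convex combination of the tail class's own (noisy) statistics and those of its closest head classes, which is one simple instance of the head-to-tail knowledge transfer the abstract describes; the paper's actual multi-expert formulation may differ substantially.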
