Revisiting Catastrophic Forgetting in Class Incremental Learning

26 Jul 2021 · Zixuan Ni, Haizhou Shi, Siliang Tang, Longhui Wei, Qi Tian, Yueting Zhuang

Although the concept of catastrophic forgetting is straightforward, its causes have received little systematic study. In this paper, we explore and reveal three causes of catastrophic forgetting in Class Incremental Learning (CIL). From the perspective of representation learning, (i) intra-phase forgetting happens when the learner fails to correctly align the same-phase data as training proceeds, and (ii) inter-phase confusion happens when the learner confuses the current-phase data with the previous-phase data. From the task-specific point of view, the CIL model suffers from the problem of (iii) classifier deviation. After investigating existing strategies, we observe a lack of study on how to prevent inter-phase confusion. To initiate research on this specific issue, we propose a simple yet effective framework, Contrastive Class Concentration for CIL (C4IL). Our framework leverages the class concentration effect of contrastive learning, yielding a representation distribution with better intra-class compactness and inter-class separability. Empirically, we observe that C4IL significantly lowers the probability of inter-phase confusion and, as a result, improves performance across multiple CIL settings on multiple datasets.
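The class concentration effect the abstract refers to can be illustrated with a minimal supervised contrastive objective: embeddings of same-class samples are pulled together (intra-class compactness) while different classes are pushed apart (inter-class separability). The sketch below is a generic NumPy illustration of this idea, not the paper's exact C4IL loss; the function name and temperature value are assumptions for demonstration.

```python
import numpy as np

def supervised_contrastive_loss(features, labels, temperature=0.1):
    """Hypothetical sketch of a supervised contrastive loss.

    Lower loss when same-class embeddings are close together and
    different-class embeddings are far apart. Illustrative only;
    not the C4IL objective from the paper.
    """
    # L2-normalise embeddings so dot products are cosine similarities
    features = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = features @ features.T / temperature  # pairwise similarity matrix
    n = len(labels)
    losses = []
    for i in range(n):
        # positives: other samples sharing sample i's label
        pos = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not pos:
            continue
        others = [j for j in range(n) if j != i]
        # log of the softmax denominator over all other samples
        log_denom = np.log(np.sum(np.exp(sim[i, others])))
        # average negative log-likelihood of picking a positive
        losses.append(-np.mean([sim[i, j] - log_denom for j in pos]))
    return float(np.mean(losses))
```

With this loss, a batch whose same-class embeddings are tightly clustered scores lower than one where the same points carry scrambled labels, which is exactly the representation geometry the abstract argues reduces inter-phase confusion.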
