Essentials for Class Incremental Learning

18 Feb 2021 · Sudhanshu Mittal, Silvio Galesso, Thomas Brox

Contemporary neural networks are limited in their ability to learn from evolving streams of training data. When trained sequentially on new or evolving tasks, their accuracy drops sharply, making them unsuitable for many real-world applications. In this work, we shed light on the causes of this well-known yet unsolved phenomenon, often referred to as catastrophic forgetting, in a class-incremental learning (class-IL) setup. We show that a combination of simple components and a loss that balances intra-task and inter-task learning can already resolve forgetting to the same extent as more complex measures proposed in the literature. Moreover, we identify the poor quality of the learned representation as another cause of catastrophic forgetting in class-IL. We show that performance correlates with the secondary class information (dark knowledge) learned by the model and that it can be improved by an appropriate regularizer. With these lessons learned, class-IL results on CIFAR-100 and ImageNet improve over the state of the art by a large margin, while the approach stays simple.
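
The abstract does not spell out the loss itself. As a rough illustration only, the PyTorch sketch below shows one standard way such a balance is commonly implemented: a cross-entropy term for intra-task learning, plus a temperature-softened distillation term against the frozen previous-step model for inter-task stability. The softened probabilities are also where the secondary class information ("dark knowledge") mentioned above lives. Every name and hyperparameter here is a hypothetical placeholder, not the paper's CCIL-SD implementation.

```python
import torch
import torch.nn.functional as F

def class_il_loss(model, prev_model, x, y, temperature=2.0, alpha=0.5):
    """Balance intra-task plasticity against inter-task stability.

    Hypothetical sketch: `alpha` trades off the cross-entropy on the
    current batch (new classes plus any rehearsal exemplars) against a
    distillation term that keeps the old-class outputs close to those
    of the frozen model from the previous step.
    """
    logits = model(x)                        # shape [B, n_old + n_new]
    ce = F.cross_entropy(logits, y)          # intra-task learning

    with torch.no_grad():
        old_logits = prev_model(x)           # frozen previous-step model

    n_old = old_logits.shape[1]

    # Soften both distributions with a temperature so the secondary
    # class probabilities ("dark knowledge") contribute to the gradient.
    kd = F.kl_div(
        F.log_softmax(logits[:, :n_old] / temperature, dim=1),
        F.softmax(old_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2                     # inter-task stability

    return (1.0 - alpha) * ce + alpha * kd
```

The self-distillation regularizer suggested by the model name (CCIL-SD) would presumably be an additional term layered on top of this skeleton.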


Datasets

CIFAR-100, ImageNet, ImageNet-100

Results
| Task | Dataset | Model | Metric | Value | Global Rank |
|------|---------|-------|--------|-------|-------------|
| Incremental Learning | CIFAR-100 (50 classes + 10 steps of 5 classes) | CCIL-SD | Average Incremental Accuracy | 65.86 | #8 |
| Incremental Learning | CIFAR-100 (50 classes + 5 steps of 10 classes) | CCIL-SD | Average Incremental Accuracy | 67.17 | #8 |
| Incremental Learning | ImageNet-100 (50 classes + 10 steps of 5 classes) | CCIL-SD | Average Incremental Accuracy | 76.77 | #4 |
| Incremental Learning | ImageNet-100 (50 classes + 5 steps of 10 classes) | CCIL-SD | Average Incremental Accuracy | 79.44 | #3 |
| Incremental Learning | ImageNet (500 classes + 5 steps of 100 classes) | CCIL-SD | Average Incremental Accuracy | 68.04 | #2 |
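
For context on the metric: "Average Incremental Accuracy" is conventionally the mean of the top-1 accuracies over all classes seen so far, evaluated after the base step and after each incremental step. The trivial helper below makes that convention explicit; the example numbers are placeholders, not results from the paper.

```python
def average_incremental_accuracy(step_accuracies):
    """Mean accuracy over the evaluations after each training step,
    including the initial base step."""
    return sum(step_accuracies) / len(step_accuracies)

# Example: base step on 50 classes followed by 5 steps of 10 classes
# (placeholder values).
print(average_incremental_accuracy([78.1, 74.3, 71.0, 69.2, 67.5, 66.9]))
```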
