63 papers with code • 0 benchmarks • 0 datasets
Incremental learning of a sequence of tasks when the task-ID is not available at test time.
These leaderboards are used to track progress in class-incremental learning.
Libraries
Use these libraries to find class-incremental learning models and implementations.
Detecting test samples drawn sufficiently far away from the training distribution, whether statistically or adversarially, is a fundamental requirement for deploying a good classifier in many real-world machine learning applications.
The idea is to first train a separate model only for the new classes, and then combine the two individual models, each trained on data from one of the two distinct sets of classes (old classes and new classes), via a novel double-distillation training objective.
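A minimal sketch of what such a double-distillation objective could look like, assuming PyTorch; the function name and the logit-regression form are illustrative, not the paper's exact formulation:

```python
import torch
import torch.nn.functional as F

def double_distillation_loss(student_logits, old_logits, new_logits,
                             num_old_classes):
    """Regress the combined model's logits toward the two teachers.

    student_logits: (B, num_old + num_new) from the combined model.
    old_logits:     (B, num_old) from the frozen old-class teacher.
    new_logits:     (B, num_new) from the frozen new-class teacher.
    """
    # Split the student's output head into old- and new-class slots.
    s_old = student_logits[:, :num_old_classes]
    s_new = student_logits[:, num_old_classes:]
    # Match each slot to its teacher (L2 on logits here; other
    # distillation losses, e.g. KL on softened softmax, also work).
    loss_old = F.mse_loss(s_old, old_logits)
    loss_new = F.mse_loss(s_new, new_logits)
    return loss_old + loss_new
```

Both teachers stay frozen during this step; only the combined model is updated, typically on data that covers both class sets.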
However, there is an inherent trade-off between effectively learning new concepts and avoiding catastrophic forgetting of previous ones.
Using an implementation based on deep neural networks, we demonstrate that phantom sampling largely avoids catastrophic forgetting.
A plain, well-trained deep learning model often cannot learn new knowledge without forgetting previously learned knowledge, a limitation known as catastrophic forgetting.
It was recently shown that architectural, regularization and rehearsal strategies can be used to train deep models sequentially on a number of disjoint tasks without forgetting previously acquired knowledge.
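As a concrete example of the rehearsal family, here is a minimal sketch of an exemplar memory based on reservoir sampling; the class name and API are illustrative, not taken from any specific paper:

```python
import random

class ReplayBuffer:
    """Fixed-size exemplar memory filled by reservoir sampling."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []   # stored (example, label) pairs
        self.seen = 0    # total examples offered so far

    def add(self, example, label):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append((example, label))
        else:
            # Replace a random slot with probability capacity / seen,
            # so every example seen so far is retained with equal chance.
            idx = random.randrange(self.seen)
            if idx < self.capacity:
                self.data[idx] = (example, label)

    def sample(self, batch_size):
        return random.sample(self.data, min(batch_size, len(self.data)))
```

During training on a new task, each minibatch can be mixed with a draw from the buffer so that earlier classes keep contributing to the loss.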
We propose a class-incremental segmentation framework that extends a deep network trained on one anatomical structure to an additional structure using a small incremental annotation set.