Class Incremental Learning

236 papers with code • 6 benchmarks • 1 dataset

Incremental learning of a sequence of tasks when the task-ID is not available at test time.
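The defining constraint above — no task ID at test time — can be illustrated with a minimal sketch (toy logits and a hypothetical two-task split, not any library's API): in task-incremental learning the argmax is restricted to the known task's classes, while in class-incremental learning the model must pick among all classes seen so far.

```python
# Toy example: 2 tasks, 2 classes each (class indices 0-3).
logits = [2.0, 0.5, 1.8, 3.1]          # hypothetical output head over all seen classes
task_classes = {0: [0, 1], 1: [2, 3]}  # hypothetical task -> class-index mapping

def predict_task_il(logits, task_id):
    """Task-IL: the task ID is known, so restrict argmax to that task's classes."""
    idx = task_classes[task_id]
    return max(idx, key=lambda i: logits[i])

def predict_class_il(logits):
    """Class-IL: no task ID, so argmax over every class seen so far."""
    return max(range(len(logits)), key=lambda i: logits[i])
```

With these logits, task-IL with task ID 0 predicts class 0, while class-IL predicts class 3 — the harder setting, since confusions across tasks are now possible.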

Libraries

Use these libraries to find Class Incremental Learning models and implementations

Most implemented papers

Overcoming catastrophic forgetting in neural networks

ContinualAI/avalanche 2 Dec 2016

The ability to learn tasks in a sequential fashion is crucial to the development of artificial intelligence.

Supervised Contrastive Learning

google-research/google-research NeurIPS 2020

Contrastive learning applied to self-supervised representation learning has seen a resurgence in recent years, leading to state-of-the-art performance in the unsupervised training of deep image models.

Learning without Forgetting

ContinualAI/avalanche 29 Jun 2016

We propose our Learning without Forgetting method, which uses only new task data to train the network while preserving the original capabilities.
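The "preserving the original capabilities" part of LwF comes from a knowledge-distillation term: the old network's softened outputs on the new-task inputs serve as targets for the updated network. A minimal sketch of such a distillation loss (plain-Python illustration with an assumed temperature parameter, not the paper's code):

```python
import math

def softmax(logits, T=1.0):
    """Temperature-softened softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def lwf_distillation_loss(old_logits, new_logits, T=2.0):
    """Cross-entropy between the frozen old model's softened outputs
    (used as soft targets) and the new model's softened outputs,
    both computed on the SAME new-task inputs."""
    targets = softmax(old_logits, T)
    probs = softmax(new_logits, T)
    return -sum(t * math.log(p) for t, p in zip(targets, probs))
```

In training, this term would be added (with some weighting) to the ordinary cross-entropy on the new task's labels; it is minimized exactly when the new network reproduces the old network's output distribution.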

iCaRL: Incremental Classifier and Representation Learning

srebuffi/iCaRL CVPR 2017

A major open problem on the road to artificial intelligence is the development of incrementally learning systems that learn about more and more concepts over time from a stream of data.
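iCaRL's classifier is a nearest-mean-of-exemplars rule: each class keeps a small exemplar set, and a test feature is assigned to the class whose exemplar mean is closest in (normalized) feature space. A rough sketch, assuming features are already extracted as plain lists (the `icarl_classify` helper is illustrative, not the authors' implementation):

```python
import math

def mean(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    return [sum(c) / len(vectors) for c in zip(*vectors)]

def normalize(v):
    """L2-normalize a vector (leave zero vectors unchanged)."""
    n = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / n for x in v]

def icarl_classify(feature, exemplar_sets):
    """Nearest-mean-of-exemplars: predict the class whose normalized
    exemplar mean is closest to the normalized input feature."""
    feature = normalize(feature)
    best, best_d = None, float("inf")
    for cls, exemplars in exemplar_sets.items():
        mu = normalize(mean([normalize(e) for e in exemplars]))
        d = sum((f - m) ** 2 for f, m in zip(feature, mu))
        if d < best_d:
            best, best_d = cls, d
    return best
```

Because the class means are recomputed from stored exemplars with the current feature extractor, the classifier adapts as the representation is incrementally updated.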

Three scenarios for continual learning

GMvandeVen/continual-learning 15 Apr 2019

Standard artificial neural networks suffer from the well-known issue of catastrophic forgetting, which makes continual or lifelong learning difficult.

On Tiny Episodic Memories in Continual Learning

facebookresearch/agem 27 Feb 2019

For successful knowledge transfer, however, the learner needs to remember how to perform previous tasks.

Continual Learning with Deep Generative Replay

ContinualAI/avalanche NeurIPS 2017

Attempts to train a comprehensive artificial intelligence capable of solving multiple tasks have been impeded by a chronic problem called catastrophic forgetting.

Rehearsal-Free Continual Learning over Small Non-I.I.D. Batches

ContinualAI/avalanche 8 Jul 2019

Ideally, continual learning should be triggered by the availability of short videos of single objects and performed online on on-board hardware with fine-grained updates.

A Simple Unified Framework for Detecting Out-of-Distribution Samples and Adversarial Attacks

pokaxpoka/deep_Mahalanobis_detector NeurIPS 2018

Detecting test samples drawn sufficiently far away from the training distribution statistically or adversarially is a fundamental requirement for deploying a good classifier in many real-world machine learning applications.
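The Mahalanobis approach scores a test sample by its distance to the closest class-conditional Gaussian fitted on training features (class means plus a shared covariance); samples far from every class are flagged as out-of-distribution. A minimal NumPy sketch of that score (an illustration of the idea, not the repository's code):

```python
import numpy as np

def mahalanobis_ood_score(x, class_means, shared_cov):
    """Confidence score: negative Mahalanobis distance from feature x to the
    CLOSEST class-conditional Gaussian. Higher score = more in-distribution."""
    precision = np.linalg.inv(shared_cov)
    dists = []
    for mu in class_means:
        d = x - mu
        dists.append(float(d @ precision @ d))
    return -min(dists)
```

Thresholding this score separates in-distribution from out-of-distribution (or adversarial) inputs; the full method also aggregates scores across intermediate layers.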

Gradient based sample selection for online continual learning

rahafaljundi/Gradient-based-Sample-Selection NeurIPS 2019

To prevent forgetting, a replay buffer is usually employed to store the previous data for the purpose of rehearsal.
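A common baseline for filling such a fixed-size buffer from a stream is reservoir sampling, which keeps every example seen so far in the buffer with equal probability. A generic sketch (this paper proposes a gradient-based selection instead; the class below only illustrates the standard rehearsal buffer it improves on):

```python
import random

class ReplayBuffer:
    """Fixed-size rehearsal buffer filled by reservoir sampling, so each of
    the n examples seen so far is stored with probability capacity / n."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.data = []
        self.n_seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        self.n_seen += 1
        if len(self.data) < self.capacity:
            self.data.append(example)
        else:
            # Replace a stored example with probability capacity / n_seen.
            j = self.rng.randrange(self.n_seen)
            if j < self.capacity:
                self.data[j] = example

    def sample(self, k):
        """Draw a rehearsal mini-batch of up to k stored examples."""
        return self.rng.sample(self.data, min(k, len(self.data)))
```

During training, each new mini-batch would be interleaved with a `sample()` of stored examples so gradients also reflect earlier data.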