About

Continual Learning is the paradigm of learning a model over a large number of tasks sequentially without forgetting knowledge obtained from the preceding tasks, even though the data from old tasks is no longer available while training on new ones.

Source: Continual Learning by Asymmetric Loss Approximation with Single-Side Overestimation

Source: Lifelong Machine Learning

Source: Continual lifelong learning with neural networks: A review

Greatest papers with code

A Combinatorial Perspective on Transfer Learning

NeurIPS 2020 deepmind/deepmind-research

Our main postulate is that the combination of task segmentation, modular learning and memory-based ensembling can give rise to generalization on an exponentially growing number of unseen tasks.

CONTINUAL LEARNING TRANSFER LEARNING

Continual Unsupervised Representation Learning

NeurIPS 2019 deepmind/deepmind-research

Continual learning aims to improve the ability of modern learning systems to deal with non-stationary distributions, typically by attempting to learn a series of tasks sequentially.

CONTINUAL LEARNING OMNIGLOT UNSUPERVISED REPRESENTATION LEARNING

River: machine learning for streaming data in Python

8 Dec 2020 online-ml/river

It is the result of merging the two most popular Python packages for stream learning: Creme and scikit-multiflow.

CONTINUAL LEARNING
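Stream-learning libraries such as River are built around an online "predict one, learn one" loop with prequential (test-then-train) evaluation. The sketch below illustrates that pattern in plain Python with a toy single-feature SGD regressor; the class and data are illustrative, not River's actual API.

```python
# Illustrative sketch of the online learning loop used in stream
# learning. OnlineLinearRegressor is a toy model, not River code.

class OnlineLinearRegressor:
    """Single-feature linear model trained by stochastic gradient descent."""

    def __init__(self, lr=0.1):
        self.lr = lr
        self.w = 0.0
        self.b = 0.0

    def predict_one(self, x):
        return self.w * x + self.b

    def learn_one(self, x, y):
        # One SGD step on the squared error of this single example.
        error = self.predict_one(x) - y
        self.w -= self.lr * error * x
        self.b -= self.lr * error

# Prequential ("test-then-train") evaluation: predict first, then learn,
# so every example is scored before the model has seen its label.
model = OnlineLinearRegressor(lr=0.1)
stream = [(i / 100, 2.0 * (i / 100) + 1.0) for i in range(100)]

total_abs_error = 0.0
for x, y in stream:
    total_abs_error += abs(model.predict_one(x) - y)
    model.learn_one(x, y)
```

The key design point is that the model never stores the stream: each example is used once and discarded, which is what makes this setting relevant to continual learning over non-stationary data.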

Three scenarios for continual learning

15 Apr 2019 GMvandeVen/continual-learning

Standard artificial neural networks suffer from the well-known issue of catastrophic forgetting, making continual or lifelong learning difficult for machine learning.

CLASS-INCREMENTAL LEARNING INCREMENTAL LEARNING
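The three scenarios this paper distinguishes differ in what the model is told at test time. The mapping below is an organizational sketch of that taxonomy, not code from the linked repository.

```python
# The three continual-learning scenarios, distinguished by whether task
# identity is provided at test time and, if not, whether it must be
# inferred. Descriptions paraphrase the paper's taxonomy.

SCENARIOS = {
    "task-incremental": "task identity is given at test time; "
                        "the model solves the indicated task",
    "domain-incremental": "task identity is not given, but it also "
                          "does not need to be inferred",
    "class-incremental": "task identity must be inferred; the model "
                         "chooses among all classes seen so far",
}

# Class-incremental learning is generally the hardest of the three,
# since the model must discriminate between classes it never saw
# together during training.
hardest = "class-incremental"
```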

Generative replay with feedback connections as a general strategy for continual learning

27 Sep 2018 GMvandeVen/continual-learning

A major obstacle to developing artificial intelligence applications capable of true lifelong learning is that artificial neural networks quickly or catastrophically forget previously learned tasks when trained on a new one.

CONTINUAL LEARNING

Fast and Flexible Multi-Task Classification Using Conditional Neural Adaptive Processes

NeurIPS 2019 google-research/meta-dataset

We introduce a conditional neural process based approach to the multi-task classification setting for this purpose, and establish connections to the meta-learning and few-shot learning literature.

ACTIVE LEARNING CONTINUAL LEARNING FEW-SHOT LEARNING IMAGE CLASSIFICATION TRANSFER LEARNING

Gradient Episodic Memory for Continual Learning

NeurIPS 2017 facebookresearch/GradientEpisodicMemory

One major obstacle towards AI is the poor ability of models to solve new problems quickly without forgetting previously acquired knowledge.

CONTINUAL LEARNING
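The core mechanism in gradient episodic memory is to constrain each update so it does not increase the loss on examples stored from previous tasks. The sketch below shows the single-constraint special case of that projection (the full method solves a quadratic program with one constraint per previous task), using plain-Python vectors for illustration.

```python
# Single-constraint gradient projection, the core idea behind GEM.
# If the proposed gradient conflicts with the gradient computed on the
# episodic memory (negative inner product), project it onto the
# half-space where the memory loss does not increase.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project_gradient(g, g_mem):
    """Return an update direction that does not increase the memory loss."""
    inner = dot(g, g_mem)
    if inner >= 0:
        return list(g)  # no conflict with the old task: keep g as-is
    coef = inner / dot(g_mem, g_mem)
    return [gi - coef * mi for gi, mi in zip(g, g_mem)]

# A conflicting pair: following g directly would undo progress on the
# task stored in memory, since dot(g, g_mem) < 0.
g = [1.0, -1.0]
g_mem = [0.0, 1.0]
g_proj = project_gradient(g, g_mem)
```

After projection the update is orthogonal to the memory gradient, so to first order it leaves the old task's loss unchanged while still making progress on the new task.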

Avalanche: an End-to-End Library for Continual Learning

1 Apr 2021 ContinualAI/avalanche

Learning continually from non-stationary data streams is a long-standing goal and a challenging problem in machine learning.

CONTINUAL LEARNING

Dreaming to Distill: Data-free Knowledge Transfer via DeepInversion

CVPR 2020 NVlabs/DeepInversion

We introduce DeepInversion, a new method for synthesizing images from the image distribution used to train a deep neural network.

CONTINUAL LEARNING NETWORK PRUNING TRANSFER LEARNING

Re-evaluating Continual Learning Scenarios: A Categorization and Case for Strong Baselines

30 Oct 2018 GT-RIPL/Continual-Learning-Benchmark

Continual learning has received a great deal of attention recently with several approaches being proposed.

CONTINUAL LEARNING L2 REGULARIZATION
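One of the strong baselines this line of work highlights is plain L2 regularization toward the previous task's weights: when training on a new task, penalize deviation from the old solution. The toy example below uses quadratic stand-in losses to show the mechanism; the function name and setup are illustrative.

```python
# L2-regularization baseline for continual learning: add the penalty
# (strength / 2) * ||w - w_old||^2 to the new task's loss, pulling the
# weights toward the solution of the previous task.

def l2_regularized_grad(w, task_grad, w_old, strength):
    """Gradient of (new-task loss) + (strength / 2) * ||w - w_old||^2."""
    return [g + strength * (wi - wo)
            for g, wi, wo in zip(task_grad, w, w_old)]

# Toy setup: the old task's solution is w_old; the new task's loss is
# 0.5 * ||w - target||^2, whose gradient is simply (w - target).
w_old = [1.0, 0.0]
target = [0.0, 1.0]
w = list(w_old)
lr, strength = 0.1, 1.0

for _ in range(200):
    task_grad = [wi - ti for wi, ti in zip(w, target)]
    grad = l2_regularized_grad(w, task_grad, w_old, strength)
    w = [wi - lr * gi for wi, gi in zip(w, grad)]
```

With equal weight on both quadratic terms the iterates converge to the midpoint between `w_old` and `target`, making the trade-off explicit: `strength` controls how much the new task is allowed to move the weights away from the old solution.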