Class-Incremental Learning

63 papers with code • 0 benchmarks • 0 datasets

Incremental learning of a sequence of tasks when the task-ID is not available at test time.



Most implemented papers

Three scenarios for continual learning

GMvandeVen/continual-learning 15 Apr 2019

Standard artificial neural networks suffer from the well-known issue of catastrophic forgetting, which makes continual or lifelong learning difficult.

A Simple Unified Framework for Detecting Out-of-Distribution Samples and Adversarial Attacks

pokaxpoka/deep_Mahalanobis_detector NeurIPS 2018

Detecting test samples drawn sufficiently far away from the training distribution statistically or adversarially is a fundamental requirement for deploying a good classifier in many real-world machine learning applications.

Class-incremental Learning via Deep Model Consolidation

mmasana/FACIL 19 Mar 2019

The idea is to first train a separate model only on the new classes, and then combine the two individual models, trained on two distinct sets of classes (old classes and new classes), via a novel double distillation training objective.
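The combination step above can be sketched as a double-distillation loss in which a consolidated student imitates the old-class teacher on the old output slots and the new-class teacher on the new ones. This is a minimal illustration using standard temperature-scaled KL distillation; all names are illustrative and the paper's exact objective differs in its details.

```python
import numpy as np

def _softmax(z, T):
    # numerically stable temperature-scaled softmax over the class axis
    z = z / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def double_distillation_loss(student_logits, old_logits, new_logits, n_old, T=2.0):
    """Sketch: the student's first n_old outputs imitate the old-class
    teacher, the remaining outputs imitate the new-class teacher."""
    s_old = student_logits[:, :n_old]
    s_new = student_logits[:, n_old:]

    def kl(teacher, student):
        # KL(teacher || student), averaged over the batch
        p_t = _softmax(teacher, T)
        p_s = _softmax(student, T)
        return (p_t * (np.log(p_t) - np.log(p_s))).sum(axis=1).mean()

    # sum the two distillation terms, scaled by T^2 as is conventional
    return (kl(old_logits, s_old) + kl(new_logits, s_new)) * T * T
```

When the student's logits exactly match both teachers, the loss is zero; any divergence on either slice of the output head is penalized.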

IL2M: Class Incremental Learning With Dual Memory

mmasana/FACIL ICCV 2019

This paper presents a class incremental learning (IL) method which exploits fine tuning and a dual memory to reduce the negative effect of catastrophic forgetting in image recognition.

Mnemonics Training: Multi-Class Incremental Learning without Forgetting

yaoyao-liu/mnemonics CVPR 2020

There is an inherent trade-off between effectively learning new concepts and avoiding catastrophic forgetting of previous ones.

Adaptive Aggregation Networks for Class-Incremental Learning

yaoyao-liu/class-incremental-learning CVPR 2021

Class-Incremental Learning (CIL) aims to learn a classification model with the number of classes increasing phase-by-phase.
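The phase-by-phase setting described here is commonly parameterized by a number of base classes plus a fixed increment of new classes per phase. A small illustrative helper (names and split scheme are assumptions, not from any particular paper) makes the setup concrete:

```python
def make_phases(num_classes, base, increment):
    """Split class IDs 0..num_classes-1 into incremental phases:
    `base` classes in phase 0, then `increment` new classes per
    subsequent phase (illustrative sketch of a CIL protocol)."""
    phases = [list(range(base))]
    start = base
    while start < num_classes:
        end = min(start + increment, num_classes)
        phases.append(list(range(start, end)))
        start = end
    return phases
```

For example, a common CIFAR-100 protocol with 50 base classes and 10 new classes per phase yields six phases, and the model must classify over all classes seen so far at each phase.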

A Strategy for an Uncompromising Incremental Learner

ragavvenkatesan/Incremental-GAN 2 May 2017

Using an implementation based on deep neural networks, we demonstrate that phantom sampling largely avoids catastrophic forgetting.

SupportNet: solving catastrophic forgetting in class incremental learning with support data

lykaust15/SupportNet 8 Jun 2018

A well-trained deep learning model typically cannot learn new knowledge without forgetting previously learned knowledge, a phenomenon known as catastrophic forgetting.

Continuous Learning in Single-Incremental-Task Scenarios

ContinualAI/avalanche 22 Jun 2018

It was recently shown that architectural, regularization and rehearsal strategies can be used to train deep models sequentially on a number of disjoint tasks without forgetting previously acquired knowledge.

Extending Pretrained Segmentation Networks with Additional Anatomical Structures

firatozdemir/LwfSeg-AeiSeg 12 Nov 2018

We propose a class-incremental segmentation framework for extending a deep network trained for some anatomical structure to yet another structure using a small incremental annotation set.