Few-Shot Class-Incremental Learning

26 papers with code • 3 benchmarks • 3 datasets


Most implemented papers

On the Soft-Subnetwork for Few-shot Class Incremental Learning

ihaeyong/softnet-fscil 15 Sep 2022

Inspired by the Regularized Lottery Ticket Hypothesis (RLTH), which posits that a dense network contains smooth (non-binary) subnetworks that match the dense network's performance, we propose a few-shot class-incremental learning (FSCIL) method called Soft-SubNetworks (SoftNet).
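The soft-subnetwork idea can be illustrated with a tiny sketch: each weight is gated by a smooth mask in (0, 1) obtained by passing a learnable score through a sigmoid, so a subnetwork is selected softly rather than with a hard 0/1 mask. All names below are illustrative, not the SoftNet repo's API.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def soft_subnetwork_forward(x, weight, mask_scores):
    """Linear map whose weights are gated by a smooth (non-binary) mask.

    Each weight is scaled by sigmoid(score), which lies strictly in (0, 1),
    so a sub-network is selected softly instead of with a hard binary mask.
    """
    soft_mask = sigmoid(mask_scores)       # values strictly in (0, 1)
    return x @ (weight * soft_mask).T      # masked linear transformation

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 8))
w = rng.standard_normal((4, 8)) * 0.01
scores = np.zeros((4, 8))                  # sigmoid(0) = 0.5 everywhere
out = soft_subnetwork_forward(x, w, scores)
```

In a full method, the mask scores would be trained jointly with the weights; here zero scores simply halve every weight, showing the soft gating.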

Neural Collapse Terminus: A Unified Solution for Class Incremental Learning and Its Variants

neuralcollapseapplications/unicil 3 Aug 2023

Beyond the standard setting, long-tail class-incremental learning and few-shot class-incremental learning have also been proposed to address data imbalance and data scarcity, respectively; both are common in real-world deployments and further exacerbate the well-known problem of catastrophic forgetting.

Few-Shot Class-Incremental Learning

xyutao/fscil CVPR 2020

FSCIL requires CNN models to incrementally learn new classes from very few labelled samples, without forgetting the previously learned ones.

Few-Shot Incremental Learning with Continually Evolved Classifiers

icoz69/cec-cvpr2021 CVPR 2021

First, we adopt a simple but effective strategy that decouples the learning of representations and classifiers: only the classifiers are updated in each incremental session, which avoids knowledge forgetting in the representations.
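The decoupled strategy can be sketched minimally: the feature extractor stays frozen, and each session only updates the classifier. Below the classifier is a nearest-prototype head over class-mean features; this is an illustrative simplification, not the CEC code.

```python
import numpy as np

def update_classifier(prototypes, features, labels):
    """Decoupled incremental step: the backbone is frozen, so only the
    classifier (here, class-mean prototypes) is updated with the new
    session's few labelled samples. Old prototypes are left untouched."""
    new = dict(prototypes)
    for c in np.unique(labels):
        new[int(c)] = features[labels == c].mean(axis=0)
    return new

def predict(prototypes, features):
    """Nearest-prototype classification over all classes seen so far."""
    classes = sorted(prototypes)
    protos = np.stack([prototypes[c] for c in classes])
    dists = ((features[:, None, :] - protos[None]) ** 2).sum(-1)
    return np.array(classes)[dists.argmin(axis=1)]

# Session 0: base classes 0 and 1. Session 1 adds class 2 with 5 shots.
rng = np.random.default_rng(1)
feats0 = np.concatenate([rng.normal(0, 0.1, (5, 4)), rng.normal(3, 0.1, (5, 4))])
protos = update_classifier({}, feats0, np.array([0] * 5 + [1] * 5))
feats1 = rng.normal(-3, 0.1, (5, 4))        # few-shot novel class
protos = update_classifier(protos, feats1, np.array([2] * 5))
preds = predict(protos, np.array([[0.0] * 4, [3.0] * 4, [-3.0] * 4]))
```

Because old prototypes are never rewritten, adding class 2 cannot degrade the decision rule for classes 0 and 1, which is the point of decoupling.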

Self-Promoted Prototype Refinement for Few-Shot Class-Incremental Learning

zhukaii/SPPR CVPR 2021

Few-shot class-incremental learning aims to recognize new classes from only a few samples without forgetting the old classes.

Subspace Regularizers for Few-Shot Class Incremental Learning

feyzaakyurek/subspace-reg ICLR 2022

The key to this approach is a new family of subspace regularization schemes that encourage weight vectors for new classes to lie close to the subspace spanned by the weights of existing classes.
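A minimal version of such a regularizer can be written directly in linear algebra: build an orthonormal basis for the subspace spanned by the existing class weights and penalize the squared distance between a new class's weight vector and its orthogonal projection onto that subspace. This is a sketch of the idea, not the paper's exact loss.

```python
import numpy as np

def subspace_penalty(w_new, W_old):
    """Squared distance from a new class's weight vector to the subspace
    spanned by the rows of W_old (the existing class weights). Zero when
    w_new already lies in that subspace."""
    # Orthonormal basis for the row space of W_old via a reduced SVD.
    U, s, _ = np.linalg.svd(W_old.T, full_matrices=False)
    basis = U[:, s > 1e-10]               # columns span the old-class subspace
    projection = basis @ (basis.T @ w_new)
    residual = w_new - projection
    return float(residual @ residual)

W_old = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])       # old classes span the x-y plane
in_plane = subspace_penalty(np.array([2.0, -1.0, 0.0]), W_old)   # ~0
off_plane = subspace_penalty(np.array([0.0, 0.0, 3.0]), W_old)   # ~9
```

During incremental training this penalty would be added to the classification loss so that novel-class weights are pulled toward the span of the base-class weights.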

Overcoming Catastrophic Forgetting in Incremental Few-Shot Learning by Finding Flat Minima

moukamisama/f2m NeurIPS 2021

Our study shows that existing methods severely suffer from catastrophic forgetting, a well-known problem in incremental learning, which is aggravated due to data scarcity and imbalance in the few-shot setting.
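One way to see what "flat minima" buys is to probe how much the loss rises under small random parameter perturbations: at a flat minimum the rise is small, so parameters can later shift to accommodate new classes without destroying old-class performance. The probe below is an illustrative sketch, not the F2M training procedure.

```python
import numpy as np

def sharpness(loss_fn, params, radius=0.1, num_samples=20, rng=None):
    """Estimate local sharpness as the mean loss increase under random
    perturbations sampled uniformly from [-radius, radius] per coordinate."""
    rng = rng or np.random.default_rng(0)
    base = loss_fn(params)
    rises = []
    for _ in range(num_samples):
        noise = rng.uniform(-radius, radius, size=params.shape)
        rises.append(loss_fn(params + noise) - base)
    return float(np.mean(rises))

flat = lambda p: float(np.sum(0.01 * p ** 2))    # wide, flat bowl
sharp = lambda p: float(np.sum(100.0 * p ** 2))  # narrow, sharp bowl
p = np.zeros(5)                                  # both minima sit at zero
```

`sharpness(flat, p)` comes out far smaller than `sharpness(sharp, p)`: the flat minimum tolerates perturbation, which is the property the paper exploits for incremental sessions.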

Graph Few-shot Class-incremental Learning

zhen-tan-dmml/gfcil 23 Dec 2021

The ability to incrementally learn new classes is vital to all real-world artificial intelligence systems.

Forward Compatible Few-Shot Class-Incremental Learning

zhoudw-zdw/cvpr22-fact CVPR 2022

Forward compatibility requires future new classes to be easily incorporated into the current model based on the current stage data, and we seek to realize it by reserving embedding space for future new classes.
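"Reserving embedding space" can be sketched as allocating extra virtual prototype slots at base-training time, so that a future class occupies a reserved slot instead of squeezing into the region used by base classes. Everything below is a hypothetical illustration, not the FACT repository's code.

```python
import numpy as np

def build_classifier(num_base, num_reserved, dim, rng):
    """Allocate prototypes for the base classes plus extra reserved
    (virtual) slots that future classes will occupy."""
    total = num_base + num_reserved
    protos = rng.standard_normal((total, dim))
    protos /= np.linalg.norm(protos, axis=1, keepdims=True)  # unit prototypes
    return protos

def assign_novel_class(protos, num_assigned, novel_feat_mean):
    """Occupy the next reserved slot with the novel class's mean feature."""
    protos = protos.copy()
    protos[num_assigned] = novel_feat_mean / np.linalg.norm(novel_feat_mean)
    return protos, num_assigned + 1

rng = np.random.default_rng(0)
protos = build_classifier(num_base=3, num_reserved=2, dim=4, rng=rng)
# A later session fills the first reserved slot (index 3) with a new class.
protos, used = assign_novel_class(protos, num_assigned=3,
                                  novel_feat_mean=np.ones(4))
```

The design choice is that the classifier's shape never changes across sessions: growth is absorbed by slots reserved in advance rather than by resizing the head.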

Constrained Few-shot Class-incremental Learning

ibm/constrained-fscil CVPR 2022

Moreover, it is imperative that such learning respect certain memory and computational constraints: (i) training samples are limited to only a few per class, (ii) the computational cost of learning a novel class remains constant, and (iii) the memory footprint of the model grows at most linearly with the number of classes observed.