Few-Shot Class-Incremental Learning

52 papers with code • 3 benchmarks • 3 datasets

Few-shot class-incremental learning (FSCIL) requires a model to incrementally learn new classes from only a few labelled samples per class in each session, without forgetting the classes learned in earlier sessions.

Most implemented papers

Constrained Few-shot Class-incremental Learning

ibm/constrained-fscil CVPR 2022

Moreover, it is imperative that such learning respect certain memory and computational constraints: (i) training samples are limited to only a few per class, (ii) the computational cost of learning a novel class remains constant, and (iii) the memory footprint of the model grows at most linearly with the number of classes observed.
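
These three constraints are naturally met by a prototype-based design in the spirit of this paper: learning a novel class reduces to averaging a handful of frozen embeddings (constant cost), and memory grows by exactly one prototype per class. A minimal sketch, assuming a frozen backbone that maps images to feature vectors; the class and method names are illustrative, not the repo's API:

```python
import torch

class PrototypeClassifier:
    def __init__(self, backbone, feat_dim):
        self.backbone = backbone.eval()          # frozen feature extractor
        self.prototypes = torch.empty(0, feat_dim)

    @torch.no_grad()
    def add_class(self, support_images):
        # support_images: (K, C, H, W) -- the K few-shot samples of one class
        feats = self.backbone(support_images)            # (K, feat_dim)
        proto = feats.mean(dim=0, keepdim=True)          # constant-cost update
        self.prototypes = torch.cat([self.prototypes, proto], dim=0)

    @torch.no_grad()
    def predict(self, images):
        feats = self.backbone(images)                    # (B, feat_dim)
        # cosine similarity of each query to each stored class prototype
        sims = torch.nn.functional.cosine_similarity(
            feats.unsqueeze(1), self.prototypes.unsqueeze(0), dim=-1)
        return sims.argmax(dim=1)
```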

On the Soft-Subnetwork for Few-shot Class Incremental Learning

ihaeyong/softnet-fscil 15 Sep 2022

Inspired by the Regularized Lottery Ticket Hypothesis (RLTH), which hypothesizes that there exist smooth (non-binary) subnetworks within a dense network that achieve performance competitive with the dense network, we propose a few-shot class-incremental learning (FSCIL) method referred to as Soft-SubNetworks (SoftNet).
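
A minimal sketch of the soft-subnetwork idea, assuming one learnable real-valued score per weight that is squashed into a smooth mask in [0, 1]; this illustrates the concept, not the repo's implementation:

```python
import torch
import torch.nn as nn

class SoftMaskedLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.02)
        # one learnable score per weight; sigmoid keeps the mask smooth
        self.scores = nn.Parameter(torch.zeros(out_features, in_features))

    def forward(self, x):
        mask = torch.sigmoid(self.scores)        # non-binary mask in (0, 1)
        return nn.functional.linear(x, self.weight * mask)
```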

Neural Collapse Terminus: A Unified Solution for Class Incremental Learning and Its Variants

neuralcollapseapplications/unicil 3 Aug 2023

Beyond the standard setting, long-tail class-incremental learning and few-shot class-incremental learning have also been proposed to address data imbalance and data scarcity, respectively; both are common in real-world deployments and further exacerbate the well-known problem of catastrophic forgetting.
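
The unified solution pre-assigns a fixed classifier with neural-collapse geometry for the whole label space, so incremental sessions never move the classifier. Below is a sketch of the standard simplex equiangular tight frame (ETF) construction such a terminus is typically built from, assuming the feature dimension is at least the number of classes; it is illustrative, not the repo's code:

```python
import torch

def simplex_etf(num_classes: int, feat_dim: int) -> torch.Tensor:
    # partial orthogonal basis; requires feat_dim >= num_classes
    P, _ = torch.linalg.qr(torch.randn(feat_dim, num_classes))
    centering = (torch.eye(num_classes)
                 - torch.ones(num_classes, num_classes) / num_classes)
    scale = (num_classes / (num_classes - 1)) ** 0.5
    # columns are unit-norm class vectors with pairwise cosine -1/(K-1),
    # the maximal equiangular separation for K classes
    return scale * P @ centering                 # (feat_dim, num_classes)
```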

Continual Learning: Forget-free Winning Subnetworks for Video Representations

ihaeyong/pfnr 19 Dec 2023

Inspired by the Lottery Ticket Hypothesis (LTH), which highlights the existence of efficient subnetworks within larger dense networks, this work considers a Winning Subnetwork (WSN) that achieves high task performance under appropriate sparsity conditions for various continual learning tasks.
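
A minimal sketch of the subnetwork-selection step, assuming a learned per-weight importance score and a fixed sparsity budget; the function is illustrative, not the repo's API:

```python
import torch

def winning_mask(scores: torch.Tensor, keep_ratio: float) -> torch.Tensor:
    # scores: per-weight importance; keep_ratio: fraction of weights to keep
    k = max(1, int(scores.numel() * keep_ratio))
    threshold = scores.flatten().topk(k).values.min()
    return (scores >= threshold).float()         # binary mask over the dense net
```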

A Streamlined Approach to Multimodal Few-Shot Class Incremental Learning for Fine-Grained Datasets

tldoan/clip_m3 10 Mar 2024

Few-shot Class-Incremental Learning (FSCIL) poses the challenge of retaining prior knowledge while learning from limited new data streams, all without overfitting.

Few-Shot Class-Incremental Learning

xyutao/fscil CVPR 2020

FSCIL requires CNN models to incrementally learn new classes from very few labelled samples, without forgetting the previously learned ones.
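
The benchmark protocol introduced here splits the label space into a large base session followed by N-way K-shot incremental sessions (e.g. on CIFAR-100: 60 base classes, then eight 5-way 5-shot sessions). A minimal sketch of such a split; the helper below is illustrative, not the repo's code, and fixed splits are used in practice:

```python
import random

def fscil_splits(num_classes=100, base=60, way=5, seed=0):
    labels = list(range(num_classes))
    random.Random(seed).shuffle(labels)
    base_classes = labels[:base]                 # large base session
    sessions = [labels[i:i + way]                # N-way incremental sessions
                for i in range(base, num_classes, way)]
    return base_classes, sessions

base_classes, sessions = fscil_splits()
print(len(base_classes), len(sessions))          # 60 base classes, 8 sessions
```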

Few-Shot Incremental Learning with Continually Evolved Classifiers

icoz69/cec-cvpr2021 CVPR 2021

First, we adopt a simple but effective decoupled learning strategy for representations and classifiers, in which only the classifiers are updated in each incremental session; this avoids knowledge forgetting in the representations.
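
A minimal sketch of the decoupling, assuming separate backbone and classifier modules: the representation is frozen so that only classifier parameters receive gradients in incremental sessions (illustrative, not the repo's API):

```python
import torch.nn as nn

def incremental_trainable_params(backbone: nn.Module, classifier: nn.Module):
    for p in backbone.parameters():
        p.requires_grad = False                  # the representation stays fixed
    backbone.eval()                              # freeze BatchNorm statistics too
    return list(classifier.parameters())

# e.g. optimizer = torch.optim.SGD(incremental_trainable_params(backbone, clf), lr=0.01)
```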

Self-Promoted Prototype Refinement for Few-Shot Class-Incremental Learning

zhukaii/SPPR CVPR 2021

Few-shot class-incremental learning aims to recognize new classes from only a few samples without forgetting the old classes.

Subspace Regularizers for Few-Shot Class Incremental Learning

feyzaakyurek/subspace-reg ICLR 2022

The key to this approach is a new family of subspace regularization schemes that encourage weight vectors for new classes to lie close to the subspace spanned by the weights of existing classes.
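
A minimal sketch of such a regularizer, assuming old-class weights W_old of shape (n_old, d) with n_old <= d: each new weight vector is penalized by its squared distance to its orthogonal projection onto span(W_old). The function is illustrative, not the repo's API:

```python
import torch

def subspace_penalty(w_new: torch.Tensor, w_old: torch.Tensor) -> torch.Tensor:
    # orthonormal basis of the subspace spanned by the old class weights
    Q, _ = torch.linalg.qr(w_old.t())            # (d, n_old)
    proj = w_new @ Q @ Q.t()                     # projection onto the subspace
    return ((w_new - proj) ** 2).sum()           # squared residual distance
```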

Overcoming Catastrophic Forgetting in Incremental Few-Shot Learning by Finding Flat Minima

moukamisama/f2m NeurIPS 2021

Our study shows that existing methods severely suffer from catastrophic forgetting, a well-known problem in incremental learning, which is aggravated by the data scarcity and imbalance of the few-shot setting.
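
One standard recipe for seeking flat minima is to average gradients over a few randomly perturbed copies of the weights, so that the solution found is insensitive to small parameter shifts. The training step below is a hedged sketch of that general idea, not the paper's exact procedure:

```python
import torch

def flat_minima_step(model, loss_fn, optimizer, x, y, radius=0.01, n_samples=2):
    params = [p for p in model.parameters() if p.requires_grad]
    optimizer.zero_grad()
    loss_fn(model(x), y).backward()              # gradient at the clean point
    for _ in range(n_samples):
        noise = [torch.empty_like(p).uniform_(-radius, radius) for p in params]
        with torch.no_grad():
            for p, e in zip(params, noise):
                p.add_(e)                        # move to a nearby point
        loss_fn(model(x), y).backward()          # accumulate its gradient too
        with torch.no_grad():
            for p, e in zip(params, noise):
                p.sub_(e)                        # restore the clean weights
    with torch.no_grad():
        for p in params:
            if p.grad is not None:
                p.grad /= (n_samples + 1)        # average over all points
    optimizer.step()
```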