Overlapped 10-1

9 papers with code • 2 benchmarks • 2 datasets

Overlapped 10-1 is a class-incremental semantic segmentation protocol, typically run on Pascal VOC 2012: a model is first trained on 10 classes and then learns the remaining 10 classes one at a time over 10 incremental steps. In the overlapped setting, the images at each step may also contain pixels of classes from other steps, which appear unlabelled (as background).

Most implemented papers

Learning without Forgetting

ContinualAI/avalanche 29 Jun 2016

We propose our Learning without Forgetting method, which uses only new task data to train the network while preserving the original capabilities.
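The core idea can be sketched as a joint objective: a standard cross-entropy on the new task plus a distillation term that keeps the current model's old-task outputs close to the frozen old model's (temperature-softened) responses, computed only on new-task data. The snippet below is an illustrative NumPy sketch, not the authors' implementation; the temperature `T` and weight `lam` are assumed hyperparameters.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def lwf_loss(new_logits_old_head, old_logits, new_logits_new_head, labels,
             T=2.0, lam=1.0):
    """Learning-without-Forgetting-style objective (illustrative sketch).

    - Cross-entropy on the new task supervises the new head.
    - A distillation term keeps the old head close to the frozen old
      model's softened predictions, using only new-task inputs.
    """
    # Standard cross-entropy for the new task.
    p_new = softmax(new_logits_new_head)
    ce = -np.mean(np.log(p_new[np.arange(len(labels)), labels] + 1e-12))

    # Distillation: cross-entropy between softened old and new responses.
    q_old = softmax(old_logits, T)            # targets from the old model
    q_new = softmax(new_logits_old_head, T)   # current model, old head
    kd = -np.mean(np.sum(q_old * np.log(q_new + 1e-12), axis=-1))

    return ce + lam * kd
```

The distillation term is minimized when the current model reproduces the old model's output distribution exactly, which is what preserves the original capabilities without any old-task data.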

Incremental Learning Techniques for Semantic Segmentation

LTTM/IL-SemSegm 31 Jul 2019

To tackle this task we propose to distill the knowledge of the previous model to retain the information about previously learned classes, whilst updating the current model to learn the new ones.
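For dense prediction, the distillation idea above is applied per pixel: only the old-class score maps of the current model are pulled toward the previous model's outputs, leaving the new channels free to learn the added classes. A minimal NumPy sketch of per-pixel distillation (an illustration of the general technique, not this paper's exact loss):

```python
import numpy as np

def softmax(z):
    """Softmax over the last axis with a stability shift."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def pixel_distillation(old_logits, new_logits):
    """Per-pixel distillation for segmentation (illustrative sketch).

    `old_logits` has shape (H, W, n_old); `new_logits` has shape
    (H, W, n_old + n_new). Only the old-class channels are distilled,
    so the extra channels remain free to fit the new classes.
    """
    n_old = old_logits.shape[-1]
    q_old = softmax(old_logits)                 # previous model's beliefs
    q_new = softmax(new_logits[..., :n_old])    # current model, old classes
    # Cross-entropy between the two distributions, averaged over pixels.
    return -np.mean(np.sum(q_old * np.log(q_new + 1e-12), axis=-1))
```

This term retains the information about previously learned classes at every spatial location while the task loss updates the model on the new ones.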

Modeling the Background for Incremental Learning in Semantic Segmentation

fcdl94/MiB CVPR 2020

Current strategies fail on this task because they do not consider a peculiar aspect of semantic segmentation: since each training step provides annotation only for a subset of all possible classes, pixels of the background class (i.e., pixels that do not belong to any other class) exhibit a semantic distribution shift.
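One way to account for this shift is to treat a background label at the current step as "background or any previously seen class" when computing the cross-entropy, aggregating the corresponding probabilities. The sketch below illustrates that background-aggregation idea in NumPy; it is a simplified rendering of the concept, not the paper's full objective.

```python
import numpy as np

def softmax(z):
    """Softmax over the last axis with a stability shift."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def unbiased_ce(logits, labels, old_class_ids, bg_id=0):
    """Cross-entropy modeling background as 'background OR old class'.

    Pixels labelled background at the current step may actually contain
    old classes, so their target probability is the *sum* of the
    background probability and the old-class probabilities.
    `logits` is (n_pixels, n_classes); `labels` is (n_pixels,).
    """
    p = softmax(logits)
    n = len(labels)
    target_p = p[np.arange(n), labels]          # ordinary labelled pixels
    # Aggregated probability for background pixels.
    agg = p[:, [bg_id] + list(old_class_ids)].sum(axis=-1)
    target_p = np.where(labels == bg_id, agg, target_p)
    return -np.mean(np.log(target_p + 1e-12))
```

With this formulation, a background pixel that the model confidently assigns to an old class is not penalized, so the model is not pushed to forget old classes that are hidden inside the new background.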

PLOP: Learning without Forgetting for Continual Semantic Segmentation

arthurdouillard/CVPR2021_PLOP CVPR 2021

An entropy-based pseudo-labeling of classes predicted by the old model is used to deal with background shift and avoid catastrophic forgetting of the old classes.
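The pseudo-labeling idea can be sketched as follows: background pixels on which the old model makes a confident (low-entropy) non-background prediction are relabelled with that prediction, while uncertain pixels keep the background label. This is an illustrative NumPy sketch; the fixed `max_entropy` threshold is an assumed simplification.

```python
import numpy as np

def pseudo_label_background(old_probs, labels, bg_id=0, max_entropy=0.5):
    """Entropy-based pseudo-labeling of background pixels (sketch).

    `old_probs` is (n_pixels, n_classes): the old model's per-pixel
    class probabilities. Background pixels where the old model is
    confident (entropy below `max_entropy`) and predicts a non-background
    class are relabelled with that class; the rest are left unchanged.
    """
    entropy = -np.sum(old_probs * np.log(old_probs + 1e-12), axis=-1)
    old_pred = old_probs.argmax(axis=-1)
    confident = (labels == bg_id) & (entropy < max_entropy) & (old_pred != bg_id)
    out = labels.copy()
    out[confident] = old_pred[confident]
    return out
```

The relabelled map can then supervise the current model with an ordinary cross-entropy, so old classes hidden in the background keep receiving a training signal.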

SSUL: Semantic Segmentation with Unknown Label for Exemplar-based Class-Incremental Learning

clovaai/SSUL NeurIPS 2021

While the recent CISS algorithms utilize variants of the knowledge distillation (KD) technique to tackle the problem, they failed to fully address the critical challenges in CISS that cause catastrophic forgetting: the semantic drift of the background class and the multi-label prediction issue.

Tackling Catastrophic Forgetting and Background Shift in Continual Semantic Segmentation

arthurdouillard/CVPR2021_PLOP 29 Jun 2021

An entropy-based pseudo-labeling of classes predicted by the old model is used to deal with background shift and avoid catastrophic forgetting of the old classes.

Representation Compensation Networks for Continual Semantic Segmentation

zhangchbin/rcil CVPR 2022

In this work, we study the continual semantic segmentation problem, where deep neural networks are required to incorporate new classes continually without catastrophic forgetting.

SATS: Self-Attention Transfer for Continual Semantic Segmentation

QIU023/SATS_Continual_Semantic_Seg 15 Mar 2022

Considering that pixels belonging to the same class in each image often share similar visual properties, a class-specific region pooling is applied to provide more efficient relationship information for knowledge transfer.
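Class-specific region pooling can be sketched as mask-average pooling: the feature vectors of all pixels sharing a class label are averaged into one prototype per class, giving a compact representation for knowledge transfer. A minimal NumPy sketch of this idea (an illustration, not the paper's implementation):

```python
import numpy as np

def class_region_pool(features, labels, num_classes):
    """Average feature vectors over the pixels of each class (sketch).

    `features` is (n_pixels, d) and `labels` is (n_pixels,). Returns a
    (num_classes, d) array of per-class prototypes; classes absent from
    the image yield zero vectors.
    """
    d = features.shape[1]
    pooled = np.zeros((num_classes, d))
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            pooled[c] = features[mask].mean(axis=0)  # mask-average pooling
    return pooled
```

Transferring relations between such class prototypes, rather than between all pixel pairs, is what makes the relationship information more efficient to distill.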

Attribution-aware Weight Transfer: A Warm-Start Initialization for Class-Incremental Semantic Segmentation

dfki-av/awt-for-ciss 13 Oct 2022

In class-incremental semantic segmentation (CISS), deep learning architectures suffer from the critical problems of catastrophic forgetting and semantic background shift.