Disjoint 10-1

6 papers with code • 1 benchmark • 0 datasets

Disjoint 10-1 is a class-incremental semantic segmentation (CISS) protocol, typically run on Pascal VOC 2012: a model is first trained on 10 classes and then learns the remaining 10 classes one at a time over 10 incremental steps. In the disjoint setting, each step's images contain only pixels of current and previously seen classes, and only the current step's classes are annotated, so old classes collapse into the background label.
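As a concrete illustration of the schedule (a minimal sketch with hypothetical helper names, not taken from any repository listed below):

```python
# Building the 10-1 class schedule for Pascal VOC's 20 foreground
# classes (label 0 is background).
BASE_CLASSES = list(range(1, 11))          # step 0: classes 1..10
INCREMENTS = [[c] for c in range(11, 21)]  # steps 1..10: one class each

def classes_seen(step):
    """Classes the model must predict after a given incremental step."""
    seen = list(BASE_CLASSES)
    for inc in INCREMENTS[:step]:
        seen.extend(inc)
    return seen

print(classes_seen(0))   # [1, ..., 10]
print(classes_seen(10))  # [1, ..., 20]
```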

Most implemented papers

Learning without Forgetting

ContinualAI/avalanche 29 Jun 2016

We propose our Learning without Forgetting method, which uses only new task data to train the network while preserving the original capabilities.
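A minimal PyTorch sketch of the idea (my own simplification with assumed names and weights, not the ContinualAI/avalanche implementation): the frozen old model provides soft targets for the old outputs, so only new-task data is needed.

```python
import torch
import torch.nn.functional as F

def lwf_loss(new_logits_new, new_logits_old, old_logits, targets,
             temperature=2.0, alpha=1.0):
    # Standard cross-entropy on the new task's labels.
    ce = F.cross_entropy(new_logits_new, targets)
    # Distillation: keep the new model's old-task outputs close to the
    # frozen old model's predictions on the same (new-task) inputs.
    kd = F.kl_div(
        F.log_softmax(new_logits_old / temperature, dim=1),
        F.softmax(old_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    return ce + alpha * kd
```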

Incremental Learning Techniques for Semantic Segmentation

LTTM/IL-SemSegm 31 Jul 2019

To tackle this task we propose to distill the knowledge of the previous model to retain the information about previously learned classes, whilst updating the current model to learn the new ones.
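For segmentation the corresponding distillation operates per pixel over the score maps of previously learned classes. A sketch under the same caveats (a simplification of output-level distillation, not the LTTM/IL-SemSegm code):

```python
import torch
import torch.nn.functional as F

def pixelwise_distillation(new_logits, old_logits):
    # new_logits: (B, C_new, H, W); old_logits: (B, C_old, H, W) from the
    # frozen previous model. Distill only the old classes' channels.
    c_old = old_logits.shape[1]
    log_p_new = F.log_softmax(new_logits[:, :c_old], dim=1)
    p_old = F.softmax(old_logits, dim=1)
    # Per-pixel KL divergence, averaged over batch and spatial dims.
    kl = (p_old * (p_old.clamp_min(1e-8).log() - log_p_new)).sum(dim=1)
    return kl.mean()
```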

Modeling the Background for Incremental Learning in Semantic Segmentation

fcdl94/MiB CVPR 2020

Current strategies fail on this task because they do not consider a peculiar aspect of semantic segmentation: since each training step provides annotation only for a subset of all possible classes, pixels of the background class (i.e., pixels that do not belong to any other class) exhibit a semantic distribution shift.
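MiB's remedy, sketched loosely (my paraphrase of the unbiased cross-entropy, not the fcdl94/MiB code): background pixels are scored against the total probability of "background or any old class", so the loss does not push the model away from old-class predictions there.

```python
import torch
import torch.nn.functional as F

def unbiased_cross_entropy(logits, targets, n_old):
    # logits: (B, C, H, W) over [background, old classes, new classes].
    # targets: (B, H, W) remapped to this step's label space, i.e.
    # 0 = background (which may hide old classes), 1..n_new = new classes.
    log_p = F.log_softmax(logits, dim=1)
    # log of the probability mass of "background or any old class"
    # replaces the plain background log-probability.
    log_bg = torch.logsumexp(log_p[:, :n_old + 1], dim=1, keepdim=True)
    log_q = torch.cat([log_bg, log_p[:, n_old + 1:]], dim=1)
    return F.nll_loss(log_q, targets, ignore_index=255)
```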

PLOP: Learning without Forgetting for Continual Semantic Segmentation

arthurdouillard/CVPR2021_PLOP CVPR 2021

We design an entropy-based pseudo-labelling of the background w.r.t. classes predicted by the old model to deal with background shift and avoid catastrophic forgetting of the old classes.
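A simplified confidence-based variant of this pseudo-labelling (the actual method uses per-class entropy thresholds; the helper names and threshold value here are assumptions):

```python
import torch

IGNORE = 255

def pseudo_label_background(targets, old_logits, max_entropy=0.5):
    # targets: (B, H, W) with 0 = background; old_logits: (B, C_old, H, W)
    # from the frozen previous model. Background pixels the old model
    # classifies with low entropy inherit its prediction; uncertain
    # background pixels are ignored by the loss.
    p = old_logits.softmax(dim=1)
    entropy = -(p * p.clamp_min(1e-8).log()).sum(dim=1)
    old_pred = p.argmax(dim=1)
    bg = targets == 0
    uncertain = bg & (entropy > max_entropy)
    confident_fg = bg & ~uncertain & (old_pred > 0)
    out = targets.clone()
    out[confident_fg] = old_pred[confident_fg]
    out[uncertain] = IGNORE
    return out
```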

SSUL: Semantic Segmentation with Unknown Label for Exemplar-based Class-Incremental Learning

clovaai/SSUL NeurIPS 2021

While recent CISS algorithms utilize variants of the knowledge distillation (KD) technique to tackle the problem, they fail to fully address the critical challenges in CISS that cause catastrophic forgetting: the semantic drift of the background class and the multi-label prediction issue.
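A simplified reading of SSUL's unknown-label assignment (a sketch with a hypothetical helper, not the clovaai/SSUL code): background pixels that an off-the-shelf saliency detector marks as salient become an auxiliary "unknown" class, separating potential future objects from true background.

```python
import torch

def assign_unknown(targets, saliency, unknown_idx, sal_thresh=0.5):
    # targets: (B, H, W) with 0 = background; saliency: (B, H, W) scores
    # in [0, 1] from a pretrained saliency detector. Salient but unlabeled
    # pixels get the "unknown" label instead of background.
    out = targets.clone()
    out[(targets == 0) & (saliency > sal_thresh)] = unknown_idx
    return out
```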

Representation Compensation Networks for Continual Semantic Segmentation

zhangchbin/rcil CVPR 2022

In this work, we study the continual semantic segmentation problem, where deep neural networks are required to continually incorporate new classes without catastrophic forgetting.
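One loose sketch of the representation-compensation idea (my reconstruction of the general mechanism, not the zhangchbin/rcil module): each convolution is split into a frozen branch that preserves knowledge from earlier steps and a trainable branch that adapts to new classes, with the two outputs averaged.

```python
import torch.nn as nn

class RCConv(nn.Module):
    # Two parallel 3x3 conv branches whose outputs are averaged; during an
    # incremental step one branch stays frozen so old knowledge is kept
    # while the other branch learns the new classes.
    def __init__(self, c_in, c_out):
        super().__init__()
        self.frozen = nn.Conv2d(c_in, c_out, 3, padding=1)
        self.trainable = nn.Conv2d(c_in, c_out, 3, padding=1)
        for p in self.frozen.parameters():
            p.requires_grad = False

    def forward(self, x):
        return 0.5 * (self.frozen(x) + self.trainable(x))
```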