Tackling Catastrophic Forgetting and Background Shift in Continual Semantic Segmentation

29 Jun 2021  ·  Arthur Douillard, Yifu Chen, Arnaud Dapogny, Matthieu Cord

Deep learning approaches are nowadays ubiquitously used to tackle computer vision tasks such as semantic segmentation, and they require large datasets and substantial computational power. Continual learning for semantic segmentation (CSS) is an emerging trend that consists of updating an old model by sequentially adding new classes. However, continual learning methods are usually prone to catastrophic forgetting. This issue is further aggravated in CSS where, at each step, old classes from previous iterations are collapsed into the background. In this paper, we propose Local POD, a multi-scale pooling distillation scheme that preserves long- and short-range spatial relationships at the feature level. Furthermore, we design an entropy-based pseudo-labelling of the background with respect to classes predicted by the old model, to deal with background shift and avoid catastrophic forgetting of the old classes. Finally, we introduce a novel rehearsal method that is particularly suited for segmentation. Our approach, called PLOP, significantly outperforms state-of-the-art methods in existing CSS scenarios, as well as in newly proposed challenging benchmarks.
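To make the two main ideas in the abstract concrete, here is a minimal NumPy sketch of (a) a multi-scale pooling distillation loss in the spirit of Local POD, and (b) entropy-based pseudo-labelling of background pixels from the old model's predictions. Function names, the scale grid, and the entropy threshold `tau` are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def pod_embedding(fmap):
    # fmap: (C, H, W). Pool along width and along height, then concatenate
    # the two flattened pooled tensors into one embedding vector.
    w_pool = fmap.mean(axis=2)  # (C, H): width-pooled
    h_pool = fmap.mean(axis=1)  # (C, W): height-pooled
    return np.concatenate([w_pool.ravel(), h_pool.ravel()])

def local_pod_loss(old_fmap, new_fmap, scales=(1, 2, 4)):
    # Sum L2 distances between POD embeddings of matching local regions at
    # several scales (whole map, 2x2 grid, 4x4 grid), so both long- and
    # short-range spatial statistics are preserved between old and new model.
    _, H, W = old_fmap.shape
    loss = 0.0
    for s in scales:
        h_step, w_step = H // s, W // s
        for i in range(s):
            for j in range(s):
                region = (slice(None),
                          slice(i * h_step, (i + 1) * h_step),
                          slice(j * w_step, (j + 1) * w_step))
                loss += np.linalg.norm(pod_embedding(old_fmap[region])
                                       - pod_embedding(new_fmap[region]))
    return loss / len(scales)

def pseudo_label_background(old_probs, bg_mask, tau=0.5):
    # old_probs: (K, H, W) softmax output of the old model.
    # bg_mask: (H, W) bool mask of pixels labelled background this step.
    # Background pixels on which the old model is confident (normalised
    # entropy below tau) receive its argmax class as pseudo-label; the
    # remaining pixels keep the background label 0.
    K = old_probs.shape[0]
    entropy = -(old_probs * np.log(old_probs + 1e-8)).sum(axis=0) / np.log(K)
    labels = np.zeros(old_probs.shape[1:], dtype=np.int64)
    confident = bg_mask & (entropy < tau)
    labels[confident] = old_probs.argmax(axis=0)[confident]
    return labels
```

With identical old and new feature maps the distillation loss is zero, and it grows as the new model's features drift, which is the intended regularisation effect during continual training.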

Results from the Paper

Task             | Dataset         | Model    | Metric         | Value | Global Rank
Overlapped 15-1  | PASCAL VOC 2012 | PLOPLong | mIoU           | 61.21 | #6
Overlapped 10-1  | PASCAL VOC 2012 | PLOPLong | mIoU           | 40.83 | #6
Overlapped 15-5  | PASCAL VOC 2012 | PLOPLong | Mean IoU (val) | 69.37 | #10
