Representation Compensation Networks for Continual Semantic Segmentation

In this work, we study continual semantic segmentation, where a deep neural network is required to incorporate new classes continually without catastrophic forgetting. We propose a structural re-parameterization mechanism, named the representation compensation (RC) module, to decouple the representation learning of old and new knowledge. The RC module consists of two dynamically evolving branches, one frozen and one trainable. In addition, we design a pooled cube knowledge distillation strategy on both the spatial and channel dimensions to further enhance the plasticity and stability of the model. We conduct experiments on two challenging continual semantic segmentation scenarios: continual class segmentation and continual domain segmentation. Without any extra computational overhead or parameters during inference, our method outperforms the state of the art. The code is available at \url{https://github.com/zhangchbin/RCIL}.
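The two-branch design described above can be sketched in PyTorch. This is a hypothetical minimal illustration, not the authors' implementation: it assumes a plain 3x3 convolution per branch (`RCModule`, `merge` are names invented here), with the "old" branch frozen and the branch outputs summed; because convolution is linear in its weights, the two branches can be fused into a single convolution at inference, which is the structural re-parameterization idea giving zero extra inference cost.

```python
import torch
import torch.nn as nn


class RCModule(nn.Module):
    """Hypothetical sketch of a representation-compensation block.

    Two parallel 3x3 conv branches whose outputs are summed: the
    frozen branch preserves old knowledge, the trainable branch
    learns new classes.
    """

    def __init__(self, channels: int):
        super().__init__()
        self.old = nn.Conv2d(channels, channels, 3, padding=1)
        self.new = nn.Conv2d(channels, channels, 3, padding=1)
        # Freeze the "old" branch so training only updates the new one.
        for p in self.old.parameters():
            p.requires_grad = False

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.old(x) + self.new(x)

    @torch.no_grad()
    def merge(self) -> nn.Conv2d:
        """Fuse both branches into one conv for inference.

        (W1*x + b1) + (W2*x + b2) == (W1+W2)*x + (b1+b2), so summing
        the weights and biases reproduces the two-branch output exactly.
        """
        fused = nn.Conv2d(self.old.in_channels, self.old.out_channels,
                          3, padding=1)
        fused.weight.copy_(self.old.weight + self.new.weight)
        fused.bias.copy_(self.old.bias + self.new.bias)
        return fused
```

After training a step, calling `merge()` yields a single conv that matches the two-branch forward pass up to floating-point rounding, so deployment carries no overhead relative to the base network.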

CVPR 2022

Results from the Paper


| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Overlapped 100-5 | ADE20K | RCNet-101 | mIoU | 29.6 | #4 |
| Overlapped 100-10 | ADE20K | RCNet-101 | Mean IoU (test) | 32.1 | #3 |
| Overlapped 100-50 | ADE20K | RCNet-101 | mIoU | 34.5 | #2 |
| Overlapped 50-50 | ADE20K | RCNet-101 | mIoU | 32.5 | #2 |
| Domain 1-1 | Cityscapes | RCNet-101 | mIoU | 48.9 | #1 |
| Domain 11-5 | Cityscapes | RCNet-101 | mIoU | 64.3 | #1 |
| Domain 11-1 | Cityscapes | RCNet-101 | mIoU | 63.0 | #1 |
| Disjoint 15-5 | PASCAL VOC 2012 | RCNet-101 | Mean IoU | 67.3 | #3 |
| Overlapped 10-1 | PASCAL VOC 2012 | RCNet-101 | mIoU | 34.3 | #7 |
| Overlapped 15-5 | PASCAL VOC 2012 | RCNet-101 | Mean IoU (val) | 72.4 | #4 |
| Overlapped 15-1 | PASCAL VOC 2012 | RCNet-101 | mIoU | 59.4 | #7 |
| Disjoint 15-1 | PASCAL VOC 2012 | RCNet-101 | mIoU | 54.7 | #3 |
| Disjoint 10-1 | PASCAL VOC 2012 | RCNet-101 | mIoU | 18.2 | #3 |
