Delving into Transformer for Incremental Semantic Segmentation

18 Nov 2022  ·  Zekai Xu, Mingyi Zhang, Jiayue Hou, Xing Gong, Chuan Wen, Chengjie Wang, Junge Zhang

Incremental semantic segmentation (ISS) is an emerging task in which an old model is updated by incrementally adding new classes. At present, methods based on convolutional neural networks dominate ISS. However, studies have shown that such methods have difficulty learning new tasks while maintaining good performance on old ones (catastrophic forgetting). In contrast, a Transformer-based method has a natural advantage in curbing catastrophic forgetting, owing to its ability to model both long-term and short-term tasks. In this work, we explore why Transformer-based architectures are better suited to ISS, and accordingly propose TISS, a Transformer-based method for Incremental Semantic Segmentation. In addition, to better alleviate catastrophic forgetting while preserving transferability on ISS, we introduce two patch-wise contrastive losses that imitate similar features and enhance feature diversity, respectively, which further improves the performance of TISS. Under extensive experimental settings on the Pascal VOC 2012 and ADE20K datasets, our method significantly outperforms state-of-the-art incremental semantic segmentation methods.
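The abstract does not spell out how the two patch-wise contrastive losses are formulated. The sketch below is a rough illustration of one plausible reading, assuming ViT-style patch tokens: an InfoNCE-style imitation term that pulls each patch embedding of the new model toward the matching patch of the frozen old model, and a diversity term that pushes distinct patches of the same image apart. All function names, tensor shapes, and the temperature parameter are assumptions made for illustration, not the paper's actual method.

import torch
import torch.nn.functional as F


def patch_imitation_loss(patches_new, patches_old, temperature=0.1):
    """InfoNCE-style imitation term (assumed formulation, not from the paper).

    Pulls each patch embedding of the new model toward the matching patch
    of the frozen old model and pushes it away from every other patch in
    the batch. Inputs are (B, N, D) patch-token embeddings.
    """
    b, n, d = patches_new.shape
    z_new = F.normalize(patches_new.reshape(b * n, d), dim=-1)
    z_old = F.normalize(patches_old.reshape(b * n, d), dim=-1)
    # Cosine similarity of every new-model patch against every old-model patch.
    logits = z_new @ z_old.t() / temperature          # (B*N, B*N)
    # The old-model patch at the same position serves as the positive.
    targets = torch.arange(b * n, device=logits.device)
    return F.cross_entropy(logits, targets)


def patch_diversity_loss(patches):
    """Diversity term (assumed formulation): penalize high cosine similarity
    between distinct patches of the same image."""
    z = F.normalize(patches, dim=-1)                  # (B, N, D)
    sim = z @ z.transpose(1, 2)                       # (B, N, N)
    n = z.shape[1]
    off_diag = ~torch.eye(n, dtype=torch.bool, device=z.device)
    return sim[:, off_diag].mean()


# Toy usage with random tensors standing in for ViT patch tokens; in actual
# training, the old-model features would come from a frozen network and be
# detached from the computation graph.
new_feats = torch.randn(2, 196, 768)   # new (trainable) model
old_feats = torch.randn(2, 196, 768)   # old (frozen) model
loss = patch_imitation_loss(new_feats, old_feats) + patch_diversity_loss(new_feats)
print(loss.item())

Under these assumptions, the imitation term addresses forgetting (old-class features stay close to what the old model produced), while the diversity term keeps the representation from collapsing, preserving capacity for new classes.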

Datasets

PASCAL VOC 2012 · ADE20K