The Role of Robust Generalization in Continual Learning: Better Transfer and Less Forgetting

21 Nov 2022  ·  Zenglin Shi, Ying Sun, Joo Hwee Lim, Mengmi Zhang

This paper considers learning a sequence of tasks continually with the objectives of generalizing to unseen data regardless of its distribution, accumulating knowledge, and transferring knowledge across tasks. To the best of our knowledge, no existing technique accomplishes all of these objectives simultaneously. This paper proposes such a technique by investigating the role of robust generalization in Continual Learning (CL). Recent findings show that models trained for robust generalization not only generalize better, but also exhibit improved transferability and tend to find flatter local minima. This motivates us to pursue robust generalization in each task of CL, facilitating the learning of new tasks and reducing the risk of forgetting previously learned ones. To this end, we propose a new online shape-texture self-distillation (STSD) method that learns both shape and texture representations for each task, improving robust generalization. Extensive experiments demonstrate that our approach can be easily combined with existing CL methods to improve generalization, encourage knowledge transfer, and reduce forgetting. We also show that our approach finds flatter local minima, further highlighting the importance of robust generalization in CL. Our proposed technique is a significant step toward achieving the aforementioned CL objectives simultaneously.
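The abstract does not spell out how STSD is computed. As a minimal sketch only, assuming the "texture view" is the original image, the "shape view" is an edge-like transform of it, and self-distillation is a symmetric soft-label consistency term between the model's predictions on the two views, a per-batch training loss could look like the following. The function names (shape_view, stsd_loss) and the temperature/alpha hyperparameters are illustrative assumptions, not the paper's exact formulation.

# Hypothetical sketch of an online shape-texture self-distillation loss (PyTorch).
# Assumptions (not from the paper): texture view = original image,
# shape view = Sobel-like edge map, distillation = symmetric KL between
# the softened predictions of the two views from the same network.
import torch
import torch.nn.functional as F

def shape_view(images: torch.Tensor) -> torch.Tensor:
    """Crude shape-biased view: per-channel gradient magnitude (Sobel filters)."""
    kx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]],
                      device=images.device).view(1, 1, 3, 3)
    ky = kx.transpose(2, 3)
    c = images.shape[1]
    gx = F.conv2d(images, kx.repeat(c, 1, 1, 1), padding=1, groups=c)
    gy = F.conv2d(images, ky.repeat(c, 1, 1, 1), padding=1, groups=c)
    return torch.sqrt(gx ** 2 + gy ** 2 + 1e-8)

def stsd_loss(model, images, labels, temperature=4.0, alpha=0.5):
    """Supervised loss on both views plus a symmetric self-distillation term."""
    logits_tex = model(images)              # texture (original) view
    logits_shp = model(shape_view(images))  # shape (edge-like) view

    # Standard cross-entropy on both views keeps each representation task-relevant.
    ce = F.cross_entropy(logits_tex, labels) + F.cross_entropy(logits_shp, labels)

    # Symmetric KL between softened predictions distills shape and texture
    # information into a single model, online, without a separate teacher.
    log_p_tex = F.log_softmax(logits_tex / temperature, dim=1)
    log_p_shp = F.log_softmax(logits_shp / temperature, dim=1)
    kd = F.kl_div(log_p_tex, log_p_shp.exp(), reduction="batchmean") + \
         F.kl_div(log_p_shp, log_p_tex.exp(), reduction="batchmean")

    return ce + alpha * (temperature ** 2) * kd

Under these assumptions, the loss is a drop-in replacement for the usual per-task cross-entropy, so it can be combined with replay- or regularization-based CL methods without changing their task-level machinery.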

