Continual Learning Using Pseudo-Replay via Latent Space Sampling

29 Sep 2021 · Gyuhak Kim, Sepideh Esmaeilpour, Zixuan Ke, Tatsuya Konishi, Bing Liu

This paper investigates continual learning in the setting of class-incremental learning (CIL). Although numerous techniques have been proposed, CIL remains a highly challenging problem due to catastrophic forgetting (CF). Moreover, few existing techniques so far have made use of pre-trained image feature extractors. In this paper, we propose to use a recently reported strong pre-trained feature extractor, CLIP, together with a novel yet simple pseudo-replay method to deal with CF. The proposed method is called PLS. Unlike the popular pseudo-replay approach that builds data generators to produce pseudo data for previous tasks, PLS works in the latent space by sampling pseudo feature representations of previous tasks from the last layer of the pre-trained feature extractor. PLS is not only simple and efficient but also preserves data privacy, because it operates on latent features rather than raw images. Experimental results show that PLS outperforms state-of-the-art baselines by a large margin, where both PLS and the baselines leverage the CLIP pre-trained image feature extractor.
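To make the latent-space sampling idea concrete, the sketch below shows one plausible instantiation: per-class feature statistics are stored instead of raw images, and pseudo features for old classes are drawn from class-conditional Gaussians during later tasks. This is an illustrative assumption, not the paper's exact sampling scheme; the names `LatentReplayBuffer` and `FEAT_DIM` are hypothetical, and real CLIP features are stood in for by random vectors so the example is self-contained.

```python
# Minimal sketch of pseudo-replay via latent space sampling.
# Assumption (not from the paper): each class's CLIP features are
# summarized by a Gaussian (mean + covariance) and replayed by sampling.
import numpy as np

FEAT_DIM = 512  # e.g., CLIP ViT-B/32 image-embedding size

class LatentReplayBuffer:
    """Stores per-class feature statistics instead of raw images."""
    def __init__(self):
        self.stats = {}  # class_id -> (mean, cov)

    def update(self, class_id, feats):
        # feats: (n_samples, FEAT_DIM) features from the frozen encoder
        mean = feats.mean(axis=0)
        # Regularize the covariance so sampling stays well-conditioned
        cov = np.cov(feats, rowvar=False) + 1e-4 * np.eye(FEAT_DIM)
        self.stats[class_id] = (mean, cov)

    def sample(self, class_id, n):
        # Draw pseudo feature vectors for a previously seen class
        mean, cov = self.stats[class_id]
        return np.random.multivariate_normal(mean, cov, size=n)

# Usage: after finishing task t, store statistics for its classes;
# while training on task t+1, mix sampled pseudo features of old
# classes into the classifier-head batches to mitigate forgetting.
buf = LatentReplayBuffer()
old_feats = np.random.randn(100, FEAT_DIM)  # stand-in for CLIP features
buf.update(class_id=0, feats=old_feats)
pseudo = buf.sample(class_id=0, n=32)       # replayed latent "examples"
print(pseudo.shape)  # (32, 512)
```

Because only low-dimensional statistics (or sampled features) are retained, no raw images from earlier tasks need to be stored, which is the basis of the privacy claim above.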
