BaSIL: Learning Incrementally using a Bayesian Memory-Based Streaming Approach

1 Jan 2021 · Soumya Banerjee, Vinay P Namboodiri

A wide variety of methods have been developed to mitigate catastrophic forgetting of previously observed data in deep neural networks. However, these methods focus mainly on incremental batch learning: they require a batch of samples to be available, and this batch is visited many times during training. We show experimentally that most such methods fail when the input arrives as a stream of data in which each instance is seen only once. In this paper we propose a solution to this problem: a Bayesian streaming learning approach, termed BaSIL, for incremental learning. Our approach enables (i) learning from instances that arrive one at a time, (ii) learning where each input sample is seen only once, and (iii) learning when the i.i.d. assumption about the data is violated, i.e., when there are class-based correlations in the input stream. Our approach uses a memory-based streaming variational Bayes technique with an efficient encoded feature representation. Streaming variational Bayes continually updates the prior, which enables better learning of the posterior distribution. We empirically demonstrate that our method substantially outperforms existing comparable incremental learning methods across different data-ordering schemes and datasets.
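To make the streaming-Bayes idea the abstract describes concrete, here is a minimal sketch. It is not the authors' BaSIL implementation: it assumes class-conditional isotropic Gaussians over encoded features with a known observation noise variance, and the `StreamingGaussianClassifier` name and its parameters are illustrative. It shows only the recursive conjugate update in which the posterior after each sample becomes the prior for the next, which is the essence of a streaming variational Bayes update.

```python
import numpy as np

class StreamingGaussianClassifier:
    """Illustrative sketch (not the paper's method): per-class Bayesian
    estimates of the mean of encoded features, updated one sample at a
    time. Each posterior serves as the prior for the next sample."""

    def __init__(self, feat_dim, prior_var=1.0, obs_var=1.0):
        self.feat_dim = feat_dim
        self.prior_var = prior_var  # variance of the initial prior on each class mean
        self.obs_var = obs_var      # assumed known observation noise variance
        self.mu = {}                # posterior mean per class
        self.var = {}               # posterior variance per class (isotropic scalar)

    def update(self, z, y):
        """Consume one encoded feature vector z with label y; z is seen only once."""
        if y not in self.mu:
            self.mu[y] = np.zeros(self.feat_dim)
            self.var[y] = self.prior_var
        # Conjugate Gaussian update: posterior precision = prior + likelihood precision.
        prec = 1.0 / self.var[y] + 1.0 / self.obs_var
        new_var = 1.0 / prec
        self.mu[y] = new_var * (self.mu[y] / self.var[y] + z / self.obs_var)
        self.var[y] = new_var  # this posterior is the prior for the next sample

    def predict(self, z):
        """With equal isotropic variances and equal class priors, the MAP class
        is the one whose posterior mean is nearest to z."""
        classes = list(self.mu)
        dists = [np.sum((z - self.mu[c]) ** 2) for c in classes]
        return classes[int(np.argmin(dists))]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clf = StreamingGaussianClassifier(feat_dim=8)
    # Synthetic, class-correlated stream: all class-0 samples arrive first,
    # then all class-1 samples (non-i.i.d.); each sample is consumed once.
    for y, center in [(0, -2.0), (1, 2.0)]:
        for _ in range(50):
            clf.update(center + rng.normal(size=8), y)
    print(clf.predict(-2.0 + rng.normal(size=8)))  # expected: 0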
