Task Agnostic Continual Learning via Meta Learning

12 Jun 2019 · Xu He, Jakub Sygnowski, Alexandre Galashov, Andrei A. Rusu, Yee Whye Teh, Razvan Pascanu

While neural networks are powerful function approximators, they suffer from catastrophic forgetting when the data distribution is not stationary. One particular formalism that studies learning under non-stationary distributions is continual learning, where the non-stationarity is imposed by a sequence of distinct tasks...
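To make the forgetting phenomenon concrete, here is a minimal toy sketch (not the paper's method): a one-parameter linear model is fit with plain SGD on task A, then on task B with a different target slope. The task names, slopes, and learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(slope, n=100):
    # Toy regression task: targets y = slope * x (assumed for illustration).
    x = rng.uniform(-1, 1, n)
    return x, slope * x

def sgd(w, x, y, lr=0.1, epochs=50):
    # Plain SGD on squared error for a scalar weight w.
    for _ in range(epochs):
        grad = 2 * np.mean((w * x - y) * x)
        w -= lr * grad
    return w

def loss(w, x, y):
    return np.mean((w * x - y) ** 2)

xa, ya = make_task(2.0)    # task A
xb, yb = make_task(-3.0)   # task B, a distinct distribution

w = 0.0
w = sgd(w, xa, ya)                 # learn task A
loss_a_before = loss(w, xa, ya)    # small: task A is learned
w = sgd(w, xb, yb)                 # then learn task B sequentially
loss_a_after = loss(w, xa, ya)     # large: task A is forgotten

print(f"Task-A loss after training on A: {loss_a_before:.4f}")
print(f"Task-A loss after training on B: {loss_a_after:.4f}")
```

After the second training phase the single weight has moved toward task B's solution, so performance on task A collapses; continual-learning methods aim to avoid exactly this degradation.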



