
Realizing Continual Learning through Modeling a Learning System as a Fiber Bundle

A human brain is capable of continual learning by nature; however, current mainstream deep neural networks suffer from a phenomenon called catastrophic forgetting (i.e., learning a new set of patterns causes a network to suddenly and completely forget what it has already learned). In this paper, we propose a generic learning model that regards a learning system as a fiber bundle. By comparing the learning performance of our model with that of conventional models whose neural networks are multilayer perceptrons, across a variety of machine-learning experiments, we found that our model not only exhibits a distinguished capability for continual learning but also bears a high information capacity. In addition, we found that in some learning scenarios the performance can be further enhanced by making the learning time-aware, mimicking episodic memory in the human brain. Last but not least, we found that the forgetting properties of our model correspond well to those of human memory. This work may shed light on how a human brain learns.
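
The abstract does not spell out the construction, but one plausible reading of "a learning system as a fiber bundle" is that each point of a base space (e.g., a coordinate derived from the input, or a time signal for the time-aware variant) selects a fiber, realized as input-conditioned network parameters. The sketch below illustrates that reading only; the class name `FiberBundleLayer`, the generator layout, and the choice of base coordinate are illustrative assumptions, not the paper's actual implementation.

```python
# A minimal sketch of one possible fiber-bundle-style layer: a small
# generator maps a base-space coordinate to the weights and bias of a
# linear map, so the effective input->output map varies over the base.
import torch
import torch.nn as nn


class FiberBundleLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int, base_dim: int):
        super().__init__()
        # Generators map a base-space coordinate to this layer's
        # parameters; each base point thus picks out its own "fiber".
        self.gen_w = nn.Linear(base_dim, in_dim * out_dim)
        self.gen_b = nn.Linear(base_dim, out_dim)
        self.in_dim, self.out_dim = in_dim, out_dim

    def forward(self, x: torch.Tensor, base: torch.Tensor) -> torch.Tensor:
        # x:    (batch, in_dim) input features
        # base: (batch, base_dim) base-space coordinate (assumption: a
        #       projection of the input, or a time signal)
        W = self.gen_w(base).view(-1, self.out_dim, self.in_dim)
        b = self.gen_b(base)
        return torch.bmm(W, x.unsqueeze(-1)).squeeze(-1) + b


# Usage: condition the fiber on a toy projection of the input itself.
layer = FiberBundleLayer(in_dim=8, out_dim=4, base_dim=2)
x = torch.randn(16, 8)
base = x[:, :2]        # toy base-space coordinate (assumption)
y = layer(x, base)     # shape: (16, 4)
```

Under this reading, different regions of the base space can learn largely independent parameter sets, which is one intuition for why such a structure might resist catastrophic forgetting; the paper's experiments and exact mechanism should be consulted for the authoritative construction.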
