AHA! an 'Artificial Hippocampal Algorithm' for Episodic Machine Learning

The majority of ML research concerns slow, statistical learning of i.i.d. samples from large, labelled datasets. Animals do not learn this way. An enviable characteristic of animal learning is 'episodic' learning - the ability to memorise a specific experience as a composition of existing concepts, after just one experience, without provided labels. The new knowledge can then be used to distinguish between similar experiences, to generalise between classes, and to selectively consolidate to long-term memory. The hippocampus is known to be vital to these abilities. AHA is a biologically plausible computational model of the hippocampus. Unlike most machine learning models, AHA is trained without external labels and uses only local credit assignment. We demonstrate AHA on a superset of the Omniglot one-shot classification benchmark. The extended benchmark covers a wider range of known hippocampal functions by testing pattern separation, pattern completion, and recall of the original input. All of these functions are performed within a single configuration of the computational model. Despite these constraints, image classification results are comparable to those of conventional deep convolutional ANNs.
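To make the evaluation setting concrete, the sketch below illustrates the general shape of a one-shot classification episode of the kind used in the Omniglot benchmark: one labelled support example per class is memorised, then unlabelled queries are matched to it. This is a minimal, generic nearest-neighbour illustration of the episode protocol, not AHA's actual architecture; the `one_shot_episode` helper, the toy random "images", and the flattening "embedding" are all hypothetical stand-ins.

```python
import numpy as np

def one_shot_episode(embed, support_images, support_labels, query_images):
    """Classify each query by nearest-neighbour match to the single
    support example of each class (an N-way, 1-shot style episode)."""
    # Embed the one support example per class, and the query examples.
    support_z = np.stack([embed(x) for x in support_images])  # (N, D)
    query_z = np.stack([embed(x) for x in query_images])      # (M, D)
    # Cosine similarity between every query and every support embedding.
    support_z /= np.linalg.norm(support_z, axis=1, keepdims=True)
    query_z /= np.linalg.norm(query_z, axis=1, keepdims=True)
    sims = query_z @ support_z.T                               # (M, N)
    # Each query receives the label of its best-matching support item.
    return support_labels[np.argmax(sims, axis=1)]

# Toy usage: random arrays stand in for Omniglot characters, and a simple
# flattening function stands in for a learned encoder.
rng = np.random.default_rng(0)
support = rng.normal(size=(20, 28, 28))                    # one exemplar per class
labels = np.arange(20)
queries = support + 0.1 * rng.normal(size=support.shape)   # noisy repeats of each class
pred = one_shot_episode(lambda x: x.ravel(), support, labels, queries)
print((pred == labels).mean())                              # fraction of queries matched correctly
```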
