no code implementations • 22 Dec 2023 • Paul Stoewer, Achim Schilling, Andreas Maier, Patrick Krauss
Cognitive maps, as represented by the entorhinal-hippocampal complex in the brain, organize and retrieve context from memories. This suggests that large language models (LLMs) like ChatGPT could harness similar architectures to function as a high-level processing center, akin to how the hippocampus operates within the cortex hierarchy.
no code implementations • 4 Jul 2023 • Paul Stoewer, Achim Schilling, Andreas Maier, Patrick Krauss
The human brain possesses the extraordinary capability to contextualize the information it receives from our environment.
no code implementations • 15 Feb 2023 • Kishore Surendra, Achim Schilling, Paul Stoewer, Andreas Maier, Patrick Krauss
Strikingly, we find that the internal representations of nine-word input sequences cluster according to the word class of the tenth word to be predicted as output, even though the neural network did not receive any explicit information about syntactic rules or word classes during training.
no code implementations • 28 Oct 2022 • Paul Stoewer, Achim Schilling, Andreas Maier, Patrick Krauss
The neural network successfully learns the similarities between different animal species and constructs a cognitive map of 'animal space' based on the principle of successor representations, with an accuracy of around 30%. This is close to the theoretical maximum, given that every animal species has more than one possible successor, i.e. nearest neighbor in feature space.
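The successor representation mentioned above has a simple closed form: for a state-transition matrix T and discount factor gamma, the SR matrix is M = (I - gamma*T)^{-1}, and each row serves as a map-like embedding of a state. A minimal toy sketch (the transition matrix and state count here are illustrative assumptions, not the paper's actual setup):

```python
import numpy as np

# Toy illustration of the successor representation (SR).
# For transition matrix T and discount gamma, the SR is
#   M = sum_t (gamma * T)^t = (I - gamma * T)^{-1}.
# The 4 states here are a stand-in for "animal species"; T is hypothetical.

gamma = 0.9
T = np.array([
    [0.0, 0.5, 0.5, 0.0],
    [0.5, 0.0, 0.0, 0.5],
    [0.5, 0.0, 0.0, 0.5],
    [0.0, 0.5, 0.5, 0.0],
])  # row-stochastic: each row sums to 1

# Closed-form SR; exists because gamma < 1 keeps (I - gamma*T) invertible.
M = np.linalg.inv(np.eye(4) - gamma * T)

# Each row of M is a state's embedding: expected discounted future occupancy
# of every other state. States with similar transition structure get similar
# rows, which is the sense in which the SR yields a "cognitive map".
print(np.round(M, 2))
```

Nearest neighbors in the rows of M then play the role of the "nearest neighbor in feature space" referred to in the abstract.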
no code implementations • 22 Feb 2022 • Paul Stoewer, Christian Schlieker, Achim Schilling, Claus Metzner, Andreas Maier, Patrick Krauss
We conclude that cognitive maps and neural network-based successor representations of structured knowledge provide a promising way to overcome some of the shortcomings of deep learning on the path toward artificial general intelligence.