Search Results for author: Patrick Krauss

Found 16 papers, 0 papers with code

Multi-Modal Cognitive Maps based on Neural Networks trained on Successor Representations

no code implementations • 22 Dec 2023 • Paul Stoewer, Achim Schilling, Andreas Maier, Patrick Krauss

Cognitive maps, as represented by the entorhinal-hippocampal complex in the brain, organize and retrieve context from memories, suggesting that large language models (LLMs) like ChatGPT could harness similar architectures to function as a high-level processing center, akin to how the hippocampus operates within the cortex hierarchy.

Hippocampus • Word Embeddings

Beyond Labels: Advancing Cluster Analysis with the Entropy of Distance Distribution (EDD)

no code implementations • 28 Nov 2023 • Claus Metzner, Achim Schilling, Patrick Krauss

In the evolving landscape of data science, the accurate quantification of clustering in high-dimensional data sets remains a significant challenge, especially in the absence of predefined labels.

Clustering
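
The excerpt above does not spell out how the EDD is computed. Purely as an illustration, here is a minimal sketch of one plausible reading of the name, i.e. the Shannon entropy of the histogram of all pairwise distances; the bin count, the toy data, and the function name are assumptions, not the authors' exact definition.

```python
import numpy as np

def distance_entropy(X, bins=50):
    """Shannon entropy (bits) of the pairwise Euclidean distance histogram of X."""
    diffs = X[:, None, :] - X[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    d = dists[np.triu_indices(len(X), k=1)]   # unique neuron/sample pairs only
    p, _ = np.histogram(d, bins=bins)
    p = p / p.sum()
    p = p[p > 0]                              # drop empty bins before taking the log
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
# two well-separated Gaussian blobs vs. unstructured noise, for comparison
clustered = np.vstack([rng.normal(loc=c, scale=0.3, size=(100, 10)) for c in (0.0, 5.0)])
unstructured = rng.normal(size=(200, 10))
print("clustered   :", round(distance_entropy(clustered), 3))
print("unstructured:", round(distance_entropy(unstructured), 3))
```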

Conceptual Cognitive Maps Formation with Neural Successor Networks and Word Embeddings

no code implementations • 4 Jul 2023 • Paul Stoewer, Achim Schilling, Andreas Maier, Patrick Krauss

The human brain possesses the extraordinary capability to contextualize the information it receives from our environment.

Word Embeddings

Word class representations spontaneously emerge in a deep neural network trained on next word prediction

no code implementations • 15 Feb 2023 • Kishore Surendra, Achim Schilling, Paul Stoewer, Andreas Maier, Patrick Krauss

Strikingly, we find that the internal representations of nine-word input sequences cluster according to the word class of the tenth word to be predicted as output, even though the neural network did not receive any explicit information about syntactic rules or word classes during training.

Language Acquisition

Quantifying and maximizing the information flux in recurrent neural networks

no code implementations • 30 Jan 2023 • Claus Metzner, Marius E. Yamakou, Dennis Voelkl, Achim Schilling, Patrick Krauss

We find that in networks with moderately strong connections, the mutual information $I$ is approximately a monotonic transformation of the root-mean-square averaged Pearson correlations between neuron-pairs, a quantity that can be efficiently computed even in large systems.
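
As an illustration of the quantity mentioned in the excerpt, the following minimal sketch computes root-mean-square averaged pairwise Pearson correlations; the random activity traces are placeholders, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
activity = rng.normal(size=(100, 2000))    # 100 neurons x 2000 time steps (surrogate data)

C = np.corrcoef(activity)                  # Pearson correlation for every neuron pair
iu = np.triu_indices_from(C, k=1)          # off-diagonal pairs only
rms_corr = np.sqrt(np.mean(C[iu] ** 2))    # root-mean-square average over pairs
print(f"RMS pairwise correlation: {rms_corr:.4f}")
```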

Extracting continuous sleep depth from EEG data without machine learning

no code implementations • 17 Jan 2023 • Claus Metzner, Achim Schilling, Maximilian Traxdorf, Holger Schulze, Konstantin Tziridis, Patrick Krauss

The human sleep-cycle has been divided into discrete sleep stages that can be recognized in electroencephalographic (EEG) and other bio-signals by trained specialists or machine learning systems.

Clustering • EEG

Neural Network based Formation of Cognitive Maps of Semantic Spaces and the Emergence of Abstract Concepts

no code implementations • 28 Oct 2022 • Paul Stoewer, Achim Schilling, Andreas Maier, Patrick Krauss

The neural network successfully learns the similarities between different animal species and constructs a cognitive map of 'animal space' based on the principle of successor representations, with an accuracy of around 30%, which is close to the theoretical maximum given that every animal species has more than one possible successor, i.e. nearest neighbor in feature space.
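
For readers unfamiliar with the principle the paper builds on: the successor representation (SR) of a Markov chain with transition matrix P and discount gamma has the closed form M = (I - gamma * P)^(-1). A minimal sketch follows; the toy transition matrix is an assumption for illustration, not the animal-feature data used in the paper.

```python
import numpy as np

n_states, gamma = 5, 0.9
rng = np.random.default_rng(2)
P = rng.random((n_states, n_states))
P /= P.sum(axis=1, keepdims=True)                  # row-stochastic transition matrix

M = np.linalg.inv(np.eye(n_states) - gamma * P)    # closed-form successor representation
print(np.round(M, 2))                              # row i: expected discounted future visits from state i
```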

Classification at the Accuracy Limit -- Facing the Problem of Data Ambiguity

no code implementations • 4 Jun 2022 • Claus Metzner, Achim Schilling, Maximilian Traxdorf, Konstantin Tziridis, Holger Schulze, Patrick Krauss

Remarkably, the accuracy limit is not affected by applying non-linear transformations to the data, even if these transformations are non-reversible and drastically reduce the information content of the input data.

Dimensionality Reduction • EEG

Neural Network based Successor Representations of Space and Language

no code implementations • 22 Feb 2022 • Paul Stoewer, Christian Schlieker, Achim Schilling, Claus Metzner, Andreas Maier, Patrick Krauss

We conclude that cognitive maps and neural network-based successor representations of structured knowledge provide a promising way to overcome some of the shortcomings of deep learning on the way towards artificial general intelligence.

Dynamical Phases and Resonance Phenomena in Information-Processing Recurrent Neural Networks

no code implementations • 5 Aug 2021 • Claus Metzner, Patrick Krauss

Moreover, we find a completely new type of resonance phenomenon, called 'Import Resonance' (IR), where the information import shows a maximum, i.e. a peak-like dependence on the coupling strength between the RNN and its input.
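
A minimal sketch of the kind of measurement behind such a resonance curve, assuming a generic random RNN and a histogram-based mutual-information estimate rather than the authors' exact protocol: sweep the input coupling strength and estimate the mutual information between a binary input stream and one unit's binarized activity.

```python
import numpy as np

rng = np.random.default_rng(3)
N, T = 50, 5000
W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))   # random recurrent weights
u = rng.integers(0, 2, size=T)                        # binary input stream

def mutual_information(x, y):
    """MI in bits between two binary sequences, from their joint histogram."""
    pxy = np.histogram2d(x, y, bins=2)[0] / len(x)
    px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

for w_in in [0.1, 0.5, 1.0, 2.0, 4.0]:                # sweep input coupling strength
    x = np.zeros(N)
    trace = np.zeros(T)
    for t in range(T):
        x = np.tanh(W @ x + w_in * (2 * u[t] - 1))    # drive all units with the input
        trace[t] = x[0]
    mi = mutual_information(u, (trace > 0).astype(int))
    print(f"w_in = {w_in:4.1f}   I(input; unit 0) ~ {mi:.3f} bit")
```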

Towards a Cognitive Computational Neuroscience of Auditory Phantom Perceptions

no code implementations • 5 Oct 2020 • Patrick Krauss, Achim Schilling

In order to gain a mechanistic understanding of how tinnitus emerges in the brain, we must build biologically plausible computational models that mimic both tinnitus development and perception, and test the tentative models with brain and behavioral experiments.

Will we ever have Conscious Machines?

no code implementations • 31 Mar 2020 • Patrick Krauss, Andreas Maier

The question of whether artificial beings or machines could become self-aware or conscious has been a philosophical question for centuries.

Sparsity through evolutionary pruning prevents neuronal networks from overfitting

no code implementations • 7 Nov 2019 • Richard C. Gerum, André Erpenbeck, Patrick Krauss, Achim Schilling

We conclude that sparsity is a central property of neural networks and should be considered for modern machine learning approaches.

BIG-bench Machine Learning • Decision Making

How deep is deep enough? -- Quantifying class separability in the hidden layers of deep neural networks

no code implementations • 5 Nov 2018 • Achim Schilling, Claus Metzner, Jonas Rietsch, Richard Gerum, Holger Schulze, Patrick Krauss

Deep neural networks typically outperform more traditional machine learning models in their ability to classify complex data, and yet it is not clear how the individual hidden layers of a deep network contribute to the overall classification performance.
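
One generic way to ask how much each hidden layer contributes, offered here only as an assumed stand-in and not as the paper's own separability measure, is to fit a linear probe on every layer's activations and compare accuracies; the activations below are random placeholders for those of a trained network.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
labels = rng.integers(0, 10, size=1000)
# hypothetical per-layer activations of a trained network (placeholder data)
layer_activations = {f"layer_{i}": rng.normal(size=(1000, 64)) for i in range(5)}

for name, acts in layer_activations.items():
    X_tr, X_te, y_tr, y_te = train_test_split(acts, labels, random_state=0)
    probe = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print(f"{name}: linear-probe accuracy = {probe.score(X_te, y_te):.3f}")
```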
