Search Results for author: Claus Metzner

Found 10 papers, 0 papers with code

Beyond Labels: Advancing Cluster Analysis with the Entropy of Distance Distribution (EDD)

no code implementations28 Nov 2023 Claus Metzner, Achim Schilling, Patrick Krauss

In the evolving landscape of data science, the accurate quantification of clustering in high-dimensional data sets remains a significant challenge, especially in the absence of predefined labels.

Clustering
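The abstract does not spell out the estimator, but the title suggests computing the Shannon entropy of the distribution of pairwise distances. A minimal sketch under that assumption (the paper's exact binning and normalization may differ):

```python
import numpy as np

def edd(X, n_bins=30):
    """Entropy of the pairwise-distance distribution (sketch).

    Assumes 'EDD' means the Shannon entropy of a normalized
    histogram of all pairwise Euclidean distances in X; the
    paper's exact estimator may differ in detail.
    """
    # All pairwise Euclidean distances, upper triangle only.
    diff = X[:, None, :] - X[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    iu = np.triu_indices(len(X), k=1)
    d = dist[iu]
    # Normalized histogram -> discrete probability distribution.
    p, _ = np.histogram(d, bins=n_bins)
    p = p / p.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

rng = np.random.default_rng(0)
# Strongly clustered data: distances concentrate around a few values,
# giving a lower entropy than unstructured uniform data.
clustered = np.concatenate([rng.normal(0, 0.05, (100, 5)),
                            rng.normal(5, 0.05, (100, 5))])
uniform = rng.uniform(0, 5, (200, 5))
print(edd(clustered), edd(uniform))
```

The intuition: tight clusters make the distance distribution bimodal and narrow, so its entropy is low, without requiring any predefined labels.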

Quantifying and maximizing the information flux in recurrent neural networks

no code implementations30 Jan 2023 Claus Metzner, Marius E. Yamakou, Dennis Voelkl, Achim Schilling, Patrick Krauss

We find that in networks with moderately strong connections, the mutual information $I$ is approximately a monotonic transformation of the root-mean-square averaged Pearson correlations between neuron-pairs, a quantity that can be efficiently computed even in large systems.
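The quantity described above, the root-mean-square average of pairwise Pearson correlations, can be computed cheaply from recorded activity. A sketch with a toy activity matrix (the function name and test signal are illustrative, not from the paper):

```python
import numpy as np

def rms_pairwise_correlation(S):
    """Root-mean-square of Pearson correlations over all neuron pairs.

    S: array of shape (n_neurons, n_timesteps), each row one
    neuron's activity time series.
    """
    C = np.corrcoef(S)                  # full correlation matrix
    iu = np.triu_indices(len(S), k=1)   # distinct pairs only
    return np.sqrt(np.mean(C[iu] ** 2))

rng = np.random.default_rng(1)
# Toy "network" activity: a shared drive makes neurons correlated.
shared = rng.normal(size=500)
S = 0.8 * shared + 0.6 * rng.normal(size=(20, 500))
print(rms_pairwise_correlation(S))
```

Because `np.corrcoef` vectorizes over all pairs, this scales to large networks, which is the practical point of using it as a proxy for the mutual information $I$.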

Extracting continuous sleep depth from EEG data without machine learning

no code implementations17 Jan 2023 Claus Metzner, Achim Schilling, Maximilian Traxdorf, Holger Schulze, Konstantin Tziridis, Patrick Krauss

The human sleep cycle is conventionally divided into discrete sleep stages that trained specialists or machine learning systems can recognize in electroencephalographic (EEG) and other bio-signals.

Clustering, EEG

Classification at the Accuracy Limit -- Facing the Problem of Data Ambiguity

no code implementations4 Jun 2022 Claus Metzner, Achim Schilling, Maximilian Traxdorf, Konstantin Tziridis, Holger Schulze, Patrick Krauss

Remarkably, the accuracy limit is not affected by applying non-linear transformations to the data, even if these transformations are non-reversible and drastically reduce the information content of the input data.

Dimensionality Reduction, EEG
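The invariance claim can be illustrated with overlapping class distributions, where ambiguity sets a Bayes accuracy limit. The sketch below only demonstrates the easy, invertible case (a monotonic `tanh` transform); the paper's stronger claim about non-reversible transformations is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
# Two overlapping 1-D Gaussian classes -> irreducible ambiguity.
x0 = rng.normal(-1.0, 1.0, n)   # class 0
x1 = rng.normal(+1.0, 1.0, n)   # class 1

def threshold_accuracy(a, b, threshold):
    """Accuracy of the optimal threshold rule on the two samples."""
    return 0.5 * ((a < threshold).mean() + (b >= threshold).mean())

# Raw data: the optimal threshold is 0 by symmetry.
acc_raw = threshold_accuracy(x0, x1, 0.0)
# Monotonic non-linear transform: the threshold maps to tanh(0) = 0
# and every per-sample decision is identical, so accuracy is unchanged.
acc_tanh = threshold_accuracy(np.tanh(x0), np.tanh(x1), np.tanh(0.0))
print(acc_raw, acc_tanh)
```

For unit-variance Gaussians centered at ±1, the limit is Φ(1) ≈ 0.84; no classifier, however deep, can exceed it on this data.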

Neural Network based Successor Representations of Space and Language

no code implementations22 Feb 2022 Paul Stoewer, Christian Schlieker, Achim Schilling, Claus Metzner, Andreas Maier, Patrick Krauss

We conclude that cognitive maps and neural network-based successor representations of structured knowledge provide a promising way to overcome some of the shortcomings of deep learning on the path toward artificial general intelligence.

Dynamical Phases and Resonance Phenomena in Information-Processing Recurrent Neural Networks

no code implementations5 Aug 2021 Claus Metzner, Patrick Krauss

Moreover, we find a new type of resonance phenomenon, called 'Import Resonance' (IR), where the information import shows a maximum, i.e., a peak-like dependence on the coupling strength between the RNN and its input.
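The measurement behind this statement can be sketched as a sweep over the input-coupling strength, estimating the mutual information between the input stream and a unit's state at each coupling. The network parameters and the binary MI estimator below are illustrative choices, not the paper's setup, and whether a peak appears depends on the dynamical regime:

```python
import numpy as np

rng = np.random.default_rng(3)

def binary_mi(a, b):
    """Mutual information (bits) between two binary sequences."""
    mi = 0.0
    for va in (0, 1):
        for vb in (0, 1):
            p_ab = np.mean((a == va) & (b == vb))
            p_a, p_b = np.mean(a == va), np.mean(b == vb)
            if p_ab > 0:
                mi += p_ab * np.log2(p_ab / (p_a * p_b))
    return mi

def info_import(w_in, T=3000, N=20):
    """MI between a random input stream and one unit's state sign."""
    W = rng.normal(0, 1.5 / np.sqrt(N), (N, N))  # recurrent weights
    u = rng.choice([-1.0, 1.0], T)               # binary input stream
    x = np.zeros(N)
    states = np.empty(T)
    for t in range(T):
        x = np.tanh(W @ x + w_in * u[t])
        states[t] = x[0]
    return binary_mi(u > 0, states > 0)

# Sweep the RNN-input coupling strength, as in an IR-style scan.
couplings = [0.01, 0.1, 0.5, 1.0, 3.0]
print([round(info_import(w), 3) for w in couplings])
```

At very weak coupling the input barely influences the dynamics, and at strong coupling the unit's sign tracks the input almost deterministically; Import Resonance concerns regimes where the maximum sits at an intermediate coupling.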

Detecting long-range interactions between migrating cells

no code implementations19 Mar 2020 Claus Metzner, Franziska Hörsch, Christoph Mark, Tina Czerwinski, Alexander Winterl, Caroline Voskens, Ben Fabry

By contrast, we find attractive interactions between NK cells and an IL-15-secreting variant of K562 tumor cells.

Detecting long-range attraction between migrating cells based on p-value distributions

no code implementations20 Jun 2019 Claus Metzner

Immune cells have evolved to recognize and eliminate pathogens, and the efficiency of this process can be measured in a Petri dish.

How deep is deep enough? -- Quantifying class separability in the hidden layers of deep neural networks

no code implementations5 Nov 2018 Achim Schilling, Claus Metzner, Jonas Rietsch, Richard Gerum, Holger Schulze, Patrick Krauss

Deep neural networks typically outperform more traditional machine learning models in their ability to classify complex data, and yet it is not clear how the individual hidden layers of a deep network contribute to the overall classification performance.
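One way to make the layer-wise question concrete is to score class separability at each hidden layer. The ratio below (mean inter-class over mean intra-class pairwise distance) is a generic stand-in; the paper defines its own separability measure, which differs in detail:

```python
import numpy as np

def separability(X, y):
    """Mean inter-class / mean intra-class pairwise distance.

    A generic stand-in for a layer-wise separability score,
    not the measure proposed in the paper.
    """
    D = np.sqrt(((X[:, None] - X[None, :]) ** 2).sum(-1))
    same = y[:, None] == y[None, :]
    off = ~np.eye(len(X), dtype=bool)
    return D[~same].mean() / D[same & off].mean()

rng = np.random.default_rng(4)
# Toy "hidden layers": deeper layers pull the two classes apart,
# which the score should reflect as a rising value.
y = np.repeat([0, 1], 50)
base = rng.normal(size=(100, 10))
for depth, sep in enumerate([0.0, 2.0, 4.0]):
    layer = base + sep * y[:, None]   # class-dependent shift grows
    print(f"layer {depth}: separability = {separability(layer, y):.2f}")
```

Applied to real activations, a score that saturates after some layer suggests the remaining depth adds little to class separation, which is the "how deep is deep enough" question.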
