no code implementations • 29 Jul 2022 • Jonas Dippel, Matthias Lenga, Thomas Goerttler, Klaus Obermayer, Johannes Höhne
In this work, we investigate the impact of transfer learning on segmentation problems, i.e., pixel-wise classification problems that can be tackled with encoder-decoder architectures.
no code implementations • 19 Jul 2022 • Thomas Goerttler, Klaus Obermayer
In transfer learning, often only the last part of the network - the so-called head - is fine-tuned.
no code implementations • 10 Jun 2021 • Nicolas Roth, Pia Bideau, Olaf Hellwich, Martin Rolfs, Klaus Obermayer
We model this active scene exploration as a sequential decision making process.
1 code implementation • ICLR Workshop Learning_to_Learn 2021 • Thomas Goerttler, Klaus Obermayer
Despite their tremendous success on these tasks, it has not yet been fully revealed why they work so well.
1 code implementation • 17 Feb 2021 • Teresa Chouzouris, Nicolas Roth, Caglar Cakan, Klaus Obermayer
Optimal control inputs to nodes are determined by minimizing a cost functional that penalizes the deviations from a desired network dynamic, the control energy, and spatially non-sparse control inputs.
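A schematic form of such a cost functional, written here purely for illustration (the weights $w_e, w_s$ and the exact norms are assumptions, not taken from the paper), is:

```latex
J[u] = \int_0^T \sum_n \bigl\| x_n(t) - x_n^{\text{target}}(t) \bigr\|^2 \, dt
     \;+\; w_e \int_0^T \sum_n \bigl\| u_n(t) \bigr\|^2 \, dt
     \;+\; w_s \sum_n \sqrt{\int_0^T \bigl\| u_n(t) \bigr\|^2 \, dt}
```

The first term penalizes deviations from the desired network dynamics, the second the control energy, and the third (an $L_1$ norm across nodes of per-node $L_2$ norms) promotes spatially sparse control inputs.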
2 code implementations • 30 Nov 2020 • Caglar Cakan, Cristiana Dimulescu, Liliia Khakimova, Daniela Obst, Agnes Flöel, Klaus Obermayer
We address the mechanism by which SOs emerge and recruit large parts of the brain using a whole-brain model constructed from empirical connectivity data, in which SOs are induced independently in each brain area by a local adaptation mechanism.
1 code implementation • 1 Oct 2019 • Vaios Laschos, Jan Tinapp, Klaus Obermayer
We propose a new algorithm that uses an auxiliary neural network to express the potential of the optimal transport map between two data distributions.
2 code implementations • 3 Jun 2019 • Caglar Cakan, Klaus Obermayer
We evaluate a reduced mean-field model of excitatory and inhibitory adaptive exponential integrate-and-fire (AdEx) neurons which can be used to efficiently study the effects of electrical stimulation on large neural populations.
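For context, the underlying single-neuron AdEx dynamics (standard textbook form, with the usual parameter names; these are not the paper's reduced mean-field equations) read:

```latex
C \frac{dV}{dt} = -g_L (V - E_L) + g_L \Delta_T \exp\!\left(\frac{V - V_T}{\Delta_T}\right) - w + I(t),
\qquad
\tau_w \frac{dw}{dt} = a (V - E_L) - w,
```

with the reset $V \to V_r$, $w \to w + b$ whenever $V$ crosses the spike threshold. The adaptation current $w$ is what gives the population-level model its slow dynamics.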
1 code implementation • 29 Mar 2019 • Ivo Trowitzsch, Christopher Schymura, Dorothea Kolossa, Klaus Obermayer
This work presents an approach that robustly binds localization with the detection of sound events in a binaural robotic system.
no code implementations • 21 Feb 2019 • Ivo Trowitzsch, Jalil Taghia, Youssef Kashef, Klaus Obermayer
Computational auditory scene analysis has gained increasing interest in recent years.
no code implementations • 22 Dec 2016 • Wendelin Böhmer, Rong Guo, Klaus Obermayer
This paper investigates a type of instability that is linked to the greedy policy improvement in approximated reinforcement learning.
no code implementations • 19 Dec 2014 • Wendelin Böhmer, Klaus Obermayer
Many applications that use empirically estimated functions face a curse of dimensionality, because the integrals over most function classes must be approximated by sampling.
no code implementations • 8 Nov 2013 • Yun Shen, Michael J. Tobia, Tobias Sommer, Klaus Obermayer
We derive a family of risk-sensitive reinforcement learning methods for agents, who face sequential decision-making tasks in uncertain environments.
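One common way to make a temporal-difference learner risk-sensitive is to pass the TD error through a nonlinear utility function before the update. The sketch below (a hypothetical two-armed bandit with a loss-averse utility; the concrete rewards and the piecewise-linear utility are illustrative assumptions, not the paper's experiments) shows how a risk-averse agent can rank a risky option below a safe one despite its higher mean reward:

```python
import numpy as np

rng = np.random.default_rng(1)

def utility(td, loss_aversion=2.0):
    # Nonlinear utility applied to the TD error: negative surprises
    # are weighted more strongly, making the agent risk-averse.
    return td if td >= 0 else loss_aversion * td

# Two-armed bandit: arm 0 is safe (reward 0.5 always); arm 1 is risky
# (+1.0 with prob. 0.8, -1.0 otherwise), with the HIGHER mean reward 0.6.
Q = np.zeros(2)
counts = np.zeros(2)
for _ in range(50000):
    a = int(rng.integers(2))                  # explore both arms uniformly
    r = 0.5 if a == 0 else (1.0 if rng.random() < 0.8 else -1.0)
    counts[a] += 1
    Q[a] += utility(r - Q[a]) / counts[a]     # Robbins-Monro step size 1/n

# The risk-averse value of the risky arm solves E[utility(r - Q)] = 0,
# i.e. 0.8*(1 - Q) - 2*0.2*(1 + Q) = 0, giving Q = 1/3 < 0.5:
# the safe arm wins even though the risky arm pays more on average.
```

A risk-neutral agent (identity utility) would instead converge to the expected rewards 0.5 and 0.6 and prefer the risky arm.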
no code implementations • 28 Oct 2011 • Yun Shen, Wilhelm Stannat, Klaus Obermayer
We introduce a general framework for measuring risk in the context of Markov control processes, using risk maps on general Borel spaces, which generalizes known concepts of risk measures in mathematical finance, operations research, and behavioral economics.
no code implementations • NeurIPS 2009 • Arno Onken, Steffen Grünewälder, Klaus Obermayer
The linear correlation coefficient is typically used to characterize and analyze dependencies of neural spike counts.
no code implementations • NeurIPS 2008 • Arno Onken, Steffen Grünewälder, Matthias Munk, Klaus Obermayer
Furthermore, copulas make a wide range of dependence structures available and can be used to analyze higher-order interactions.
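The copula idea can be sketched in a few lines: sample correlated Gaussians, map them to uniforms, then push the uniforms through the inverse CDFs of the desired marginals. This minimal sketch assumes a Gaussian copula with Poisson spike-count marginals (the papers consider richer copula families); rates and the correlation are illustrative:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(2)

def normal_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def poisson_ppf(u, lam):
    # Inverse CDF of Poisson(lam), found by accumulating probabilities.
    k, p = 0, np.exp(-lam)
    cdf = p
    while cdf < u:
        k += 1
        p *= lam / k
        cdf += p
    return k

# A Gaussian copula with correlation rho couples two Poisson spike-count
# marginals (rates lam1, lam2) without changing those marginals.
rho, lam1, lam2 = 0.6, 5.0, 8.0
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=5000)
u = np.clip(np.vectorize(normal_cdf)(z), 1e-12, 1.0 - 1e-12)
counts = np.array([[poisson_ppf(u1, lam1), poisson_ppf(u2, lam2)]
                   for u1, u2 in u])
corr = np.corrcoef(counts[:, 0], counts[:, 1])[0, 1]
```

Swapping the copula (e.g. for one with tail dependence) changes the joint dependence structure while leaving both Poisson marginals intact, which is exactly what makes copulas attractive for modeling spike-count dependencies.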
no code implementations • NeurIPS 2008 • Klaus Wimmer, Marcel Stimberg, Robert Martin, Lars Schwabe, Jorge Mariño, James Schummers, David C. Lyon, Mriganka Sur, Klaus Obermayer
A quantitative analysis shows that the data provides strong evidence for a network model in which the afferent input is dominated by strong, balanced contributions of recurrent excitation and inhibition.