Search Results for author: David Kappel

Found 10 papers, 3 papers with code

A synapse-centric account of the free energy principle

no code implementations · 23 Mar 2021 · David Kappel, Christian Tetzlaff

The free energy principle (FEP) is a mathematical framework that describes how biological systems self-organize and survive in their environment.

Embodied Synaptic Plasticity with Online Reinforcement Learning

1 code implementation · 3 Mar 2020 · Jacques Kaiser, Michael Hoff, Andreas Konle, J. Camilo Vasquez Tieck, David Kappel, Daniel Reichard, Anand Subramoney, Robert Legenstein, Arne Rönnau, Wolfgang Maass, Rüdiger Dillmann

We demonstrate this framework by evaluating Synaptic Plasticity with Online REinforcement learning (SPORE), a reward-learning rule based on synaptic sampling, on two visuomotor tasks: reaching and lane following.
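
As a rough illustration of the rule's mechanics: synaptic sampling combines a drift term (a prior gradient plus a reward-modulated gradient estimate) with Gaussian exploration noise. The sketch below is a minimal rendering of that idea in NumPy; the function name, constants, and Gaussian prior are illustrative assumptions, not taken from the released SPORE code.

    import numpy as np

    rng = np.random.default_rng(0)

    def synaptic_sampling_step(theta, reward_grad, lr=1e-3, temperature=0.1, prior_var=1.0):
        """One Euler step of reward-modulated synaptic sampling (illustrative).

        The drift combines the gradient of a Gaussian log-prior with a
        reward-weighted gradient estimate; Gaussian noise keeps the
        synaptic parameters exploring (Langevin-style dynamics).
        """
        drift = -theta / prior_var + reward_grad  # d/dtheta log p(theta) + reward term
        noise = np.sqrt(2.0 * lr * temperature) * rng.standard_normal(theta.shape)
        return theta + lr * drift + noise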

Structural plasticity on an accelerated analog neuromorphic hardware system

no code implementations · 27 Dec 2019 · Sebastian Billaudelle, Benjamin Cramer, Mihai A. Petrovici, Korbinian Schreiber, David Kappel, Johannes Schemmel, Karlheinz Meier

In computational neuroscience, as well as in machine learning, neuromorphic devices promise an accelerated and scalable alternative to neural network simulations.

Attention on Abstract Visual Reasoning

no code implementations · 14 Nov 2019 · Lukas Hahne, Timo Lüddecke, Florentin Wörgötter, David Kappel

Our proposed hybrid model represents an alternative approach to learning abstract relations using self-attention, and demonstrates that the Transformer network is also well suited for abstract visual reasoning.

Program induction · Relational Reasoning +1
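
For context, the self-attention mechanism this model builds on is the standard scaled dot-product form. The NumPy sketch below shows the generic single-head operation, not the paper's hybrid architecture; the projection matrices are assumed inputs.

    import numpy as np

    def self_attention(x, Wq, Wk, Wv):
        """Single-head scaled dot-product self-attention (generic form).

        x: (n_tokens, d_model); Wq, Wk, Wv: (d_model, d_head) projections.
        """
        q, k, v = x @ Wq, x @ Wk, x @ Wv
        scores = q @ k.T / np.sqrt(k.shape[-1])           # pairwise token affinities
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
        return weights @ v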

Efficient Reward-Based Structural Plasticity on a SpiNNaker 2 Prototype

no code implementations · 20 Mar 2019 · Yexin Yan, David Kappel, Felix Neumärker, Johannes Partzsch, Bernhard Vogginger, Sebastian Höppner, Steve Furber, Wolfgang Maass, Robert Legenstein, Christian Mayr

Advances in neuroscience uncover the mechanisms employed by the brain to efficiently solve complex learning tasks with very limited resources.

Deep Rewiring: Training very sparse deep networks

5 code implementations · ICLR 2018 · Guillaume Bellec, David Kappel, Wolfgang Maass, Robert Legenstein

Neuromorphic hardware tends to pose limits on the connectivity of the deep networks that one can run on it.
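
Deep Rewiring (DEEP R) maintains a fixed connection budget: active weights follow noisy gradient descent under an L1 prior, and any connection whose parameter would cross zero is pruned and replaced by a randomly chosen dormant one. The NumPy sketch below renders that loop under stated assumptions (flat parameter arrays, illustrative constants); it is not the authors' released implementation.

    import numpy as np

    rng = np.random.default_rng(0)

    def deep_r_step(w, sign, active, grad, lr=1e-2, l1=1e-4, temperature=1e-4):
        """One rewiring step in the spirit of DEEP R (illustrative).

        w: weight magnitudes, sign: fixed signs, active: boolean mask,
        grad: loss gradient w.r.t. the effective weights sign * w.
        """
        noise = np.sqrt(2.0 * lr * temperature) * rng.standard_normal(w.shape)
        w = np.where(active, w - lr * (sign * grad + l1) + noise, w)
        died = active & (w < 0)                # parameters crossing zero are pruned
        w[died], active[died] = 0.0, False
        dormant = np.flatnonzero(~active)
        n_revive = int(min(died.sum(), dormant.size))
        if n_revive:                           # keep the connection count constant
            revive = rng.choice(dormant, size=n_revive, replace=False)
            active[revive] = True
            sign[revive] = rng.choice([-1.0, 1.0], size=n_revive)
        return w, sign, active

Iterating this step holds the number of active connections constant while the network explores different sparse connectivity patterns.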

A dynamic connectome supports the emergence of stable computational function of neural circuits through reward-based learning

no code implementations · 13 Apr 2017 · David Kappel, Robert Legenstein, Stefan Habenschuss, Michael Hsieh, Wolfgang Maass

These data are inconsistent with common models of network plasticity, and raise the questions of how neural circuits can maintain a stable computational function in spite of these continuously ongoing processes, and of what functional use these ongoing processes might be.

CaMKII activation supports reward-based neural network optimization through Hamiltonian sampling

no code implementations · 1 Jun 2016 · Zhaofei Yu, David Kappel, Robert Legenstein, Sen Song, Feng Chen, Wolfgang Maass

Our theoretical analysis shows that stochastic search could in principle even attain optimal network configurations by emulating one of the best-known nonlinear optimization methods, simulated annealing.
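
Simulated annealing itself is a textbook Metropolis-style stochastic search with a decaying temperature. The sketch below shows the generic method the analysis refers to; the cooling schedule and proposal scale are arbitrary choices, and this is not the paper's model of CaMKII dynamics.

    import numpy as np

    rng = np.random.default_rng(0)

    def simulated_annealing(energy, theta, n_steps=10_000, t0=1.0, t_min=1e-3):
        """Minimize `energy` by Metropolis search with geometric cooling."""
        e = energy(theta)
        best, best_e = theta.copy(), e
        for step in range(n_steps):
            t = max(t0 * 0.999 ** step, t_min)            # cooling schedule
            proposal = theta + 0.1 * rng.standard_normal(theta.shape)
            de = energy(proposal) - e
            if de < 0 or rng.random() < np.exp(-de / t):  # Metropolis acceptance
                theta, e = proposal, e + de
                if e < best_e:
                    best, best_e = theta.copy(), e
        return best, best_e

For example, simulated_annealing(lambda th: np.sum(th ** 2), rng.standard_normal(5)) drives the parameter vector toward zero.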

Synaptic Sampling: A Bayesian Approach to Neural Network Plasticity and Rewiring

no code implementations · NeurIPS 2015 · David Kappel, Stefan Habenschuss, Robert Legenstein, Wolfgang Maass

We reexamine in this article the conceptual and mathematical framework for understanding the organization of plasticity in spiking neural networks.

Network Plasticity as Bayesian Inference

1 code implementation · 20 Apr 2015 · David Kappel, Stefan Habenschuss, Robert Legenstein, Wolfgang Maass

General results from statistical learning theory suggest understanding not only brain computations, but also brain plasticity, as probabilistic inference.

Bayesian Inference · Learning Theory
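
The core idea of the paper is a stochastic plasticity rule whose stationary distribution is a posterior over network parameters. A generic Langevin-dynamics sketch of that idea follows; the step size and gradient callback are placeholders, not the released implementation.

    import numpy as np

    rng = np.random.default_rng(0)

    def sample_posterior(grad_log_posterior, theta, lr=1e-3, n_steps=5_000):
        """Unadjusted Langevin dynamics over network parameters.

        theta drifts up the log-posterior gradient while injected Gaussian
        noise makes the trajectory sample from (approximately) the posterior
        rather than settle at its mode.
        """
        samples = []
        for _ in range(n_steps):
            noise = np.sqrt(2.0 * lr) * rng.standard_normal(theta.shape)
            theta = theta + lr * grad_log_posterior(theta) + noise
            samples.append(theta.copy())
        return np.asarray(samples)

With grad_log_posterior = lambda th: -th (a standard-normal posterior), the sample histogram approaches N(0, 1).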
