Search Results for author: Christian Tetzlaff

Found 6 papers, 1 paper with code

Synaptic Diversity in ANNs Can Facilitate Faster Learning

no code implementations • 29 Sep 2021 • Martin Hofmann, Moritz F. P. Becker, Christian Tetzlaff, Patrick Mäder

Various advancements in artificial neural networks (ANNs) are inspired by biological concepts, e.g., the artificial neuron, an efficient model of the biological nerve cell that demonstrates learning capabilities on large amounts of data.
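For context, a minimal sketch of the standard artificial neuron the abstract refers to (a weighted sum of inputs passed through a nonlinearity); the weights, inputs, and function names below are illustrative and are not taken from the paper.

    import numpy as np

    def artificial_neuron(x, w, b):
        """Standard artificial neuron: weighted sum of inputs plus bias,
        passed through a sigmoid nonlinearity."""
        z = np.dot(w, x) + b              # aggregate synaptic input
        return 1.0 / (1.0 + np.exp(-z))   # sigmoid activation

    # Illustrative usage with arbitrary values
    x = np.array([0.5, -1.0, 2.0])        # presynaptic activities
    w = np.array([0.8, 0.1, -0.4])        # synaptic weights
    print(artificial_neuron(x, w, b=0.2))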

Continual Learning with Memory Cascades

no code implementations • NeurIPS Workshop ICBINB 2021 • David Kappel, Francesco Negri, Christian Tetzlaff

This general formulation also allows us to use the model for online learning, where the network is given no knowledge about task-switching times.

Continual Learning • Permuted-MNIST

A synapse-centric account of the free energy principle

no code implementations • 23 Mar 2021 • David Kappel, Christian Tetzlaff

The free energy principle (FEP) is a mathematical framework that describes how biological systems self-organize and survive in their environment.
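For reference, the variational free energy at the heart of the FEP in its standard, generic form (not the paper's specific synapse-centric formulation):

    F = \mathbb{E}_{q(s)}\!\left[\ln q(s) - \ln p(o, s)\right]
      = D_{\mathrm{KL}}\!\left[\, q(s) \,\|\, p(s \mid o) \,\right] - \ln p(o)

Here o denotes observations, s hidden states, p the generative model, and q the approximate posterior; minimizing F drives q toward the true posterior while bounding the log model evidence ln p(o) from below.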

Robust trajectory generation for robotic control on the neuromorphic research chip Loihi

1 code implementation • 26 Aug 2020 • Carlo Michaelis, Andrew B. Lehr, Christian Tetzlaff

With this, we show that the anisotropic network on Loihi reliably encodes sequential patterns of neural activity, each representing a robotic action, and that the patterns allow the generation of multidimensional trajectories on control-relevant timescales.

Transfer entropy-based feedback improves performance in artificial neural networks

no code implementations • 13 Jun 2017 • Sebastian Herzog, Christian Tetzlaff, Florentin Wörgötter

The structure of the majority of modern deep neural networks is characterized by unidirectional feed-forward connectivity across a very large number of layers.
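For reference, transfer entropy from a source process Y to a target process X in its standard (Schreiber) form; the paper uses this quantity to construct feedback, but the definition below is the generic one, written with single-step histories for brevity:

    TE_{Y \to X} = \sum_{x_{t+1},\, x_t,\, y_t} p(x_{t+1}, x_t, y_t)\,
                   \log \frac{p(x_{t+1} \mid x_t, y_t)}{p(x_{t+1} \mid x_t)}

It quantifies how much knowing the past of Y reduces uncertainty about the next state of X beyond what X's own past already provides, making it a directed, model-free measure of information flow.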
