Search Results for author: Franz Scherr

Found 6 papers, 2 papers with code

Self-Supervised Learning Through Efference Copies

1 code implementation • 17 Oct 2022 • Franz Scherr, Qinghai Guo, Timoleon Moraitis

Specifically, the brain also transforms the environment through efference, i.e. motor commands; however, it sends to itself an efference copy (EC) of the full commands, i.e. more than a mere SSL sign.

Tasks: Image Classification, Object Detection, +2

Current State and Future Directions for Learning in Biological Recurrent Neural Networks: A Perspective Piece

no code implementations • 12 May 2021 • Luke Y. Prince, Roy Henha Eyono, Ellen Boven, Arna Ghosh, Joe Pemberton, Franz Scherr, Claudia Clopath, Rui Ponte Costa, Wolfgang Maass, Blake A. Richards, Cristina Savin, Katharina Anna Wilmes

We briefly review common assumptions about biological learning in light of findings from experimental neuroscience, and contrast them with the efficiency of gradient-based learning in recurrent neural networks.

Reservoirs learn to learn

no code implementations • 16 Sep 2019 • Anand Subramoney, Franz Scherr, Wolfgang Maass

We wondered whether the performance of liquid state machines can be improved if the recurrent weights are chosen with a purpose, rather than randomly.
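The liquid-state-machine setting the authors start from can be illustrated with a minimal reservoir sketch: recurrent weights are drawn randomly and kept fixed, and only a linear readout is trained. All variable names and the one-step memory task below are illustrative assumptions, not the paper's actual setup:

```python
import numpy as np

# Minimal reservoir-computing sketch: random fixed recurrent weights,
# only the linear readout is trained by least squares.
rng = np.random.default_rng(1)
n_res, T = 50, 200
W = rng.normal(scale=1.0 / np.sqrt(n_res), size=(n_res, n_res))  # random, fixed
w_in = rng.normal(size=n_res)

u = np.sin(np.arange(T) * 0.1)   # input signal
y = np.roll(u, 1)                # toy target: recall the previous input

states = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])  # reservoir dynamics (untrained)
    states[t] = x

# only this readout is fitted; the recurrent weights stay random
w_out, *_ = np.linalg.lstsq(states, y, rcond=None)
pred = states @ w_out
mse = np.mean((pred - y) ** 2)
```

The paper's question is whether optimizing the recurrent weights themselves (rather than leaving `W` random, as above) improves performance.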

Eligibility traces provide a data-inspired alternative to backpropagation through time

no code implementations • NeurIPS Workshop Neuro_AI 2019 • Guillaume Bellec, Franz Scherr, Elias Hajek, Darjan Salaj, Anand Subramoney, Robert Legenstein, Wolfgang Maass

Learning in recurrent neural networks (RNNs) is most often implemented by gradient descent using backpropagation through time (BPTT), but BPTT does not accurately model how the brain learns.

Tasks: Speech Recognition
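The eligibility-trace idea can be illustrated with a toy example: for a linear leaky unit, a per-synapse trace that filters past inputs with the same leak as the hidden state yields the same gradient as unrolling through time, but is computed forward and online. This is a minimal sketch of that principle under simplified linear dynamics, not the authors' actual algorithm; all names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
T, n_in = 20, 3
alpha = 0.9                      # leak factor of the hidden state
w = rng.normal(size=n_in)        # input weights being learned
x = rng.normal(size=(T, n_in))   # input sequence
target = rng.normal(size=T)      # per-step regression targets

h = 0.0
trace = np.zeros(n_in)           # eligibility trace, one per synapse
grad = np.zeros(n_in)            # accumulated gradient estimate

for t in range(T):
    h = alpha * h + w @ x[t]     # leaky hidden state
    # the trace filters past inputs with the same leak as the state,
    # so dh/dw is available forward in time, with no unrolling as in BPTT
    trace = alpha * trace + x[t]
    err = h - target[t]          # per-step learning signal
    grad += err * trace          # local, online gradient accumulation

w_new = w - 0.01 * grad          # one gradient step
```

In this linear case the online estimate `grad` equals the exact BPTT gradient of the squared error; the papers extend this idea to spiking and nonlinear recurrent networks, where it becomes an approximation.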

Neuromorphic Hardware learns to learn

no code implementations • 15 Mar 2019 • Thomas Bohnstingl, Franz Scherr, Christian Pehle, Karlheinz Meier, Wolfgang Maass

In contrast, the hyperparameters and learning algorithms of the networks of neurons in the brain that neuromorphic hardware aims to emulate have been optimized through extensive evolutionary and developmental processes for specific ranges of computing and learning tasks.

Biologically inspired alternatives to backpropagation through time for learning in recurrent neural nets

3 code implementations • 25 Jan 2019 • Guillaume Bellec, Franz Scherr, Elias Hajek, Darjan Salaj, Robert Legenstein, Wolfgang Maass

This lack of understanding is linked to a lack of learning algorithms for recurrent networks of spiking neurons (RSNNs) that are both functionally powerful and can be implemented by known biological mechanisms.
