1 code implementation • 17 Oct 2022 • Franz Scherr, Qinghai Guo, Timoleon Moraitis
Specifically, the brain also transforms the environment through efference, i.e. motor commands; however, it sends to itself an efference copy (EC) of the full commands, i.e. more than a mere SSL sign.
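A minimal, hypothetical PyTorch sketch of this idea: a network learns a representation by predicting the full motor command (the efference copy) from the sensory input that the command produced. The architecture, dimensions, and data below are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch: using an efference copy (EC) of a motor command as a
# self-supervised target. All names, sizes, and data are illustrative assumptions.
import torch
import torch.nn as nn

encoder = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 32))
ec_head = nn.Linear(32, 8)  # predicts the full motor command, not a binary label
optimizer = torch.optim.Adam(
    list(encoder.parameters()) + list(ec_head.parameters()), lr=1e-3
)

for _ in range(100):
    motor_command = torch.randn(16, 8)          # command the agent issued (stand-in data)
    observation = torch.randn(16, 64)           # sensory consequence of that command (stand-in data)
    prediction = ec_head(encoder(observation))  # predict the EC from the observation
    loss = ((prediction - motor_command) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```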
no code implementations • 12 May 2021 • Luke Y. Prince, Roy Henha Eyono, Ellen Boven, Arna Ghosh, Joe Pemberton, Franz Scherr, Claudia Clopath, Rui Ponte Costa, Wolfgang Maass, Blake A. Richards, Cristina Savin, Katharina Anna Wilmes
We provide a brief review of the common assumptions about biological learning, together with findings from experimental neuroscience, and contrast them with the efficiency of gradient-based learning in recurrent neural networks.
no code implementations • 16 Sep 2019 • Anand Subramoney, Franz Scherr, Wolfgang Maass
We wondered whether the performance of liquid state machines can be improved if the recurrent weights are chosen with a purpose, rather than randomly.
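For context, the standard liquid state machine baseline keeps the recurrent weights random and trains only a linear readout. The sketch below is a simplified rate-based (echo-state-style) stand-in for that baseline; the sizes and toy task are assumptions for illustration.

```python
# Minimal rate-based reservoir sketch (an echo-state-style stand-in for a liquid
# state machine): recurrent weights are random and untrained, only the linear
# readout is fit. Dimensions and the toy task are illustrative assumptions.
import torch

torch.manual_seed(0)
n_in, n_rec, T = 3, 200, 500
W_in = torch.randn(n_rec, n_in) * 0.5
W_rec = torch.randn(n_rec, n_rec) * (1.0 / n_rec ** 0.5)  # random, untrained recurrent weights

x = torch.randn(T, n_in)                 # input stream (stand-in data)
y = x[:, 0:1].roll(shifts=5, dims=0)     # toy target: delayed copy of the first input channel

states = []
h = torch.zeros(n_rec)
for t in range(T):
    h = torch.tanh(W_in @ x[t] + W_rec @ h)
    states.append(h)
S = torch.stack(states)                  # reservoir states, shape (T, n_rec)

# Train only the readout, here with ridge regression.
lam = 1e-2
W_out = torch.linalg.solve(S.T @ S + lam * torch.eye(n_rec), S.T @ y)
print(((S @ W_out - y) ** 2).mean())
```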
no code implementations • NeurIPS Workshop Neuro_AI 2019 • Guillaume Bellec, Franz Scherr, Elias Hajek, Darjan Salaj, Anand Subramoney, Robert Legenstein, Wolfgang Maass
Learning in recurrent neural networks (RNNs) is most often implemented by gradient descent using backpropagation through time (BPTT), but BPTT does not model accurately how the brain learns.
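A minimal sketch of the BPTT training scheme referred to here, with an assumed toy task and dimensions: the sequence is unrolled in the forward pass, and the backward pass propagates gradients through every time step.

```python
# Minimal sketch of gradient descent with backpropagation through time (BPTT)
# on a vanilla RNN; the task and dimensions are illustrative assumptions.
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=4, hidden_size=32, batch_first=True)
readout = nn.Linear(32, 1)
optimizer = torch.optim.SGD(list(rnn.parameters()) + list(readout.parameters()), lr=0.05)

for _ in range(200):
    x = torch.randn(8, 20, 4)                   # batch of sequences (stand-in data)
    target = x.sum(dim=(1, 2)).unsqueeze(1)     # toy target: sum over the whole sequence
    h_seq, _ = rnn(x)                           # forward pass unrolled over all 20 time steps
    prediction = readout(h_seq[:, -1])          # read out from the final hidden state
    loss = ((prediction - target) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()                             # BPTT: gradients flow backward through every time step
    optimizer.step()
```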
no code implementations • 15 Mar 2019 • Thomas Bohnstingl, Franz Scherr, Christian Pehle, Karlheinz Meier, Wolfgang Maass
In contrast, the hyperparameters and learning algorithms of networks of neurons in the brain, which they aim to emulate, have been optimized through extensive evolutionary and developmental processes for specific ranges of computing and learning tasks.
3 code implementations • 25 Jan 2019 • Guillaume Bellec, Franz Scherr, Elias Hajek, Darjan Salaj, Robert Legenstein, Wolfgang Maass
This lack of understanding is linked to a lack of learning algorithms for recurrent networks of spiking neurons (RSNNs) that are both functionally powerful and can be implemented by known biological mechanisms.
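The biologically implementable alternative pursued in this line of work replaces BPTT with eligibility traces maintained locally at each synapse and combined online with a learning signal. Below is a schematic, simplified sketch of such an update for a single leaky layer; the toy task, the dynamics, and the random error broadcast are illustrative assumptions rather than the paper's exact algorithm.

```python
# Schematic eligibility-trace-style update for a simple leaky unit: each synapse
# keeps a local trace that is combined online with a broadcast learning signal,
# instead of backpropagating through time. Toy task, dynamics, and the random
# error broadcast are illustrative assumptions.
import torch

torch.manual_seed(0)
n_in, n_rec, T, alpha, lr = 5, 20, 100, 0.9, 1e-3
W_in = torch.randn(n_rec, n_in) * 0.1
B = torch.randn(n_rec)                   # fixed random weights that broadcast the error to each unit

h = torch.zeros(n_rec)
trace = torch.zeros(n_rec, n_in)         # one eligibility trace per synapse, updated locally

for t in range(T):
    x = torch.randn(n_in)                # input at this time step (stand-in data)
    pre_act = W_in @ x
    h = alpha * h + torch.tanh(pre_act)  # leaky dynamics: h_t = alpha * h_{t-1} + tanh(W x_t)
    # Local trace for these dynamics: dh_t/dW = alpha * dh_{t-1}/dW + (1 - tanh^2) x^T
    trace = alpha * trace + (1 - torch.tanh(pre_act) ** 2).unsqueeze(1) * x.unsqueeze(0)
    error = h.mean() - x.sum()           # toy scalar error (readout: mean of h, target: sum of x)
    learning_signal = B * error          # per-unit learning signal, no backward pass through time
    W_in = W_in - lr * learning_signal.unsqueeze(1) * trace  # local update: learning signal x trace
```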