Search Results for author: Alexander Seeholzer

Found 2 papers, 1 paper with code

Working memory facilitates reward-modulated Hebbian learning in recurrent neural networks

1 code implementation • NeurIPS Workshop Neuro_AI 2019 • Roman Pogodin, Dane Corneil, Alexander Seeholzer, Joseph Heng, Wulfram Gerstner

Reservoir computing is a powerful tool to explain how the brain learns temporal sequences, such as movements, but existing learning schemes are either biologically implausible or too inefficient to explain animal performance.

Temporal Sequences
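
The abstract above refers to learning temporal sequences with a reservoir and a reward-modulated Hebbian rule. The snippet below is a minimal, illustrative sketch of a generic three-factor (reward x exploration x presynaptic activity) update on the readout of a fixed random reservoir; it is not the learning scheme proposed in the paper, and all sizes, constants, and the toy target sequence are assumptions for demonstration only.

```python
# Generic reward-modulated Hebbian learning on a reservoir readout (sketch).
# NOT the paper's method; sizes and constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_res, n_out = 200, 1                                    # reservoir / readout sizes (assumed)
W = rng.normal(0, 1 / np.sqrt(n_res), (n_res, n_res))    # fixed random recurrent weights
W_out = np.zeros((n_out, n_res))                         # plastic readout weights
leak, lr = 0.3, 1e-3

def step(x, u):
    """One leaky-integrator reservoir update driven by scalar input u."""
    return (1 - leak) * x + leak * np.tanh(W @ x + u)

def target(t):
    """Toy target sequence the readout should reproduce."""
    return np.sin(0.1 * t)

x = np.zeros(n_res)
r_bar = 0.0                                              # running reward baseline
for t in range(5000):
    x = step(x, 0.5)
    noise = rng.normal(0, 0.1, n_out)                    # exploratory output perturbation
    y = W_out @ x + noise
    reward = -np.abs(y - target(t)).sum()                # higher reward = smaller error
    # Three-factor rule: (reward - baseline) x exploration noise x presynaptic activity
    W_out += lr * (reward - r_bar) * np.outer(noise, x)
    r_bar = 0.9 * r_bar + 0.1 * reward                   # update reward baseline
```

The baseline subtraction turns the raw reward into a reinforcement signal, so weight changes correlate exploration noise with above-average outcomes rather than with the (always negative) error itself.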

Algorithmic Composition of Melodies with Deep Recurrent Neural Networks

no code implementations • 23 Jun 2016 • Florian Colombo, Samuel P. Muscinelli, Alexander Seeholzer, Johanni Brea, Wulfram Gerstner

A big challenge in algorithmic composition is to devise a model that is both easily trainable and able to reproduce the long-range temporal dependencies typical of music.
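
As a rough illustration of the kind of model the abstract describes, the sketch below trains a small note-level LSTM on a toy melody and samples a continuation. It is not the architecture from the paper; the note vocabulary, melody, and hyperparameters are invented for demonstration.

```python
# Toy note-level LSTM melody model (illustrative sketch, not the paper's model).
import torch
import torch.nn as nn

notes = ["C4", "D4", "E4", "F4", "G4", "A4", "B4"]       # assumed note vocabulary
melody = [0, 2, 4, 5, 4, 2, 0, 0, 2, 4, 5, 4, 2, 0]      # toy training melody (indices)

class MelodyRNN(nn.Module):
    def __init__(self, vocab, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab, 16)
        self.lstm = nn.LSTM(16, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, x, state=None):
        h, state = self.lstm(self.embed(x), state)
        return self.head(h), state

model = MelodyRNN(len(notes))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
seq = torch.tensor(melody).unsqueeze(0)                  # shape (1, T)

for _ in range(200):                                     # next-note prediction training
    logits, _ = model(seq[:, :-1])
    loss = nn.functional.cross_entropy(logits.squeeze(0), seq[0, 1:])
    opt.zero_grad(); loss.backward(); opt.step()

# Sample a short continuation from the trained model
x, state = seq[:, :1], None
generated = []
for _ in range(8):
    logits, state = model(x, state)
    x = torch.multinomial(torch.softmax(logits[0, -1], dim=-1), 1).unsqueeze(0)
    generated.append(notes[x.item()])
print(generated)
```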
