Search Results for author: Laura Kriener

Found 12 papers, 8 papers with code

The Role of Temporal Hierarchy in Spiking Neural Networks

no code implementations • 26 Jul 2024 • Filippo Moro, Pau Vilimelis Aceituno, Laura Kriener, Melika Payvand

Temporal dynamics such as synaptic and neuronal time constants and transmission delays have recently been shown to offer computational benefits, helping to reduce the overall number of parameters required in the network and to increase the accuracy of SNNs on temporal tasks.

Inductive Bias • Keyword Spotting

DelGrad: Exact event-based gradients in spiking networks for training delays and weights

1 code implementation • 30 Apr 2024 • Julian Göltz, Jimmy Weber, Laura Kriener, Sebastian Billaudelle, Peter Lake, Johannes Schemmel, Melika Payvand, Mihai A. Petrovici

For the first time, we experimentally demonstrate the memory efficiency and accuracy benefits of adding delays to SNNs on noisy mixed-signal hardware.

Backpropagation through space, time, and the brain

no code implementations • 25 Mar 2024 • Benjamin Ellenberger, Paul Haider, Jakob Jordan, Kevin Max, Ismael Jaras, Laura Kriener, Federico Benitez, Mihai A. Petrovici

The resulting dynamics can be interpreted as a real-time, biologically plausible approximation of backpropagation through space and time in deep cortical networks with continuous-time neuronal dynamics and continuously active, local synaptic plasticity.

ELiSe: Efficient Learning of Sequences in Structured Recurrent Networks

1 code implementation • 26 Feb 2024 • Laura Kriener, Kristin Völk, Ben von Hünerbein, Federico Benitez, Walter Senn, Mihai A. Petrovici

Our resulting model for Efficient Learning of Sequences (ELiSe) builds on these features to acquire and replay complex non-Markovian spatio-temporal patterns using only local, always-on and phase-free synaptic plasticity.

Gradient-based methods for spiking physical systems

no code implementations • 29 Aug 2023 • Julian Göltz, Sebastian Billaudelle, Laura Kriener, Luca Blessing, Christian Pehle, Eric Müller, Johannes Schemmel, Mihai A. Petrovici

Recent efforts have fostered significant progress towards deep learning in spiking networks, both in theory and in silico.

Deep Learning

NeuroBench: A Framework for Benchmarking Neuromorphic Computing Algorithms and Systems

1 code implementation • 10 Apr 2023 • Jason Yik, Korneel Van den Berghe, Douwe den Blanken, Younes Bouhadjar, Maxime Fabre, Paul Hueber, Weijie Ke, Mina A Khoei, Denis Kleyko, Noah Pacik-Nelson, Alessandro Pierro, Philipp Stratmann, Pao-Sheng Vincent Sun, Guangzhi Tang, Shenqi Wang, Biyan Zhou, Soikat Hasan Ahmed, George Vathakkattil Joseph, Benedetto Leto, Aurora Micheli, Anurag Kumar Mishra, Gregor Lenz, Tao Sun, Zergham Ahmed, Mahmoud Akl, Brian Anderson, Andreas G. Andreou, Chiara Bartolozzi, Arindam Basu, Petrut Bogdan, Sander Bohte, Sonia Buckley, Gert Cauwenberghs, Elisabetta Chicca, Federico Corradi, Guido de Croon, Andreea Danielescu, Anurag Daram, Mike Davies, Yigit Demirag, Jason Eshraghian, Tobias Fischer, Jeremy Forest, Vittorio Fra, Steve Furber, P. Michael Furlong, William Gilpin, Aditya Gilra, Hector A. Gonzalez, Giacomo Indiveri, Siddharth Joshi, Vedant Karia, Lyes Khacef, James C. Knight, Laura Kriener, Rajkumar Kubendran, Dhireesha Kudithipudi, Shih-Chii Liu, Yao-Hong Liu, Haoyuan Ma, Rajit Manohar, Josep Maria Margarit-Taulé, Christian Mayr, Konstantinos Michmizos, Dylan R. Muir, Emre Neftci, Thomas Nowotny, Fabrizio Ottati, Ayca Ozcelikkale, Priyadarshini Panda, Jongkil Park, Melika Payvand, Christian Pehle, Mihai A. Petrovici, Christoph Posch, Alpha Renner, Yulia Sandamirskaya, Clemens JS Schaefer, André van Schaik, Johannes Schemmel, Samuel Schmidgall, Catherine Schuman, Jae-sun Seo, Sadique Sheik, Sumit Bam Shrestha, Manolis Sifalakis, Amos Sironi, Kenneth Stewart, Matthew Stewart, Terrence C. Stewart, Jonathan Timcheck, Nergis Tömen, Gianvito Urgese, Marian Verhelst, Craig M. Vineyard, Bernhard Vogginger, Amirreza Yousefzadeh, Fatima Tuz Zohora, Charlotte Frenkel, Vijay Janapa Reddi

To address these shortcomings, we present NeuroBench: a benchmark framework for neuromorphic computing algorithms and systems.

Benchmarking

Learning efficient backprojections across cortical hierarchies in real time

1 code implementation • 20 Dec 2022 • Kevin Max, Laura Kriener, Garibaldi Pineda García, Thomas Nowotny, Ismael Jaras, Walter Senn, Mihai A. Petrovici

Models of sensory processing and learning in the cortex need to efficiently assign credit to synapses in all areas.

The Yin-Yang dataset

1 code implementation • 16 Feb 2021 • Laura Kriener, Julian Göltz, Mihai A. Petrovici

The Yin-Yang dataset was developed for research on biologically plausible error backpropagation and deep learning in spiking neural networks.
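The dataset consists of 2D points inside a yin-yang symbol, labeled by the region they fall into (yin, yang, or one of the two dots). The sketch below generates such a dataset via rejection sampling; it is a minimal reconstruction for illustration, and the specific radii (`r_big`, `r_small`), disc placement, and region criteria are my assumptions, not guaranteed to match the authors' reference implementation.

```python
import numpy as np

def yin_yang_class(x, y, r_big=0.5, r_small=0.1):
    """Assign a point to yin (0), yang (1) or dot (2).

    Assumed geometry: a yin-yang symbol of radius r_big centred at
    (r_big, r_big), with two small dots of radius r_small at the
    centres of the left and right half-discs.
    """
    d_left = np.hypot(x - 0.5 * r_big, y - r_big)   # distance to left dot centre
    d_right = np.hypot(x - 1.5 * r_big, y - r_big)  # distance to right dot centre
    if d_left < r_small or d_right < r_small:       # inside one of the two dots
        return 2
    # Assumed yin region: the left inner half-disc plus the upper part
    # of the symbol outside the right inner half-disc.
    is_yin = (d_left <= 0.5 * r_big) or (y > r_big and d_right > 0.5 * r_big)
    return int(is_yin)

def sample_yin_yang(n, r_big=0.5, r_small=0.1, seed=0):
    """Draw n labeled points uniformly from the big disc (rejection sampling)."""
    rng = np.random.default_rng(seed)
    points, labels = [], []
    while len(points) < n:
        x, y = rng.uniform(0.0, 2.0 * r_big, size=2)
        if np.hypot(x - r_big, y - r_big) <= r_big:  # keep only points in the disc
            points.append((x, y))
            labels.append(yin_yang_class(x, y, r_big, r_small))
    return np.array(points), np.array(labels)
```

Because the classes are defined by simple geometric boundaries, the dataset is small and fully synthetic, which makes it convenient for debugging learning rules before moving to larger benchmarks.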

Deep Learning
