Search Results for author: Gert Cauwenberghs

Found 21 papers, 1 paper with code

Event-Driven Contrastive Divergence for Spiking Neuromorphic Systems

no code implementations • 5 Nov 2013 • Emre Neftci, Srinjoy Das, Bruno Pedroni, Kenneth Kreutz-Delgado, Gert Cauwenberghs

However, the traditional RBM architecture and the commonly used training algorithm known as Contrastive Divergence (CD) are based on discrete updates and exact arithmetic, which do not map directly onto a dynamical neural substrate.

Dimensionality Reduction
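For orientation, below is a minimal sketch of the discrete CD-1 update that the paper maps onto spiking dynamics; this is the textbook binary-RBM rule with illustrative NumPy names and sizes, not the event-driven version proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, b_v, b_h, v0, lr=0.01):
    """One CD-1 step for a binary RBM: data-driven (positive) phase,
    a single Gibbs step, then the reconstruction-driven (negative) phase."""
    p_h0 = sigmoid(v0 @ W + b_h)                       # hidden probabilities given the data
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    p_v1 = sigmoid(h0 @ W.T + b_v)                     # one Gibbs step: reconstruct visibles
    v1 = (rng.random(p_v1.shape) < p_v1).astype(float)
    p_h1 = sigmoid(v1 @ W + b_h)                       # hidden probabilities given reconstruction
    W += lr * (np.outer(v0, p_h0) - np.outer(v1, p_h1))
    b_v += lr * (v0 - v1)
    b_h += lr * (p_h0 - p_h1)
    return W, b_v, b_h

# Toy usage: 6 visible and 4 hidden units, one random binary data vector
W = rng.normal(0, 0.1, (6, 4)); b_v = np.zeros(6); b_h = np.zeros(4)
v0 = (rng.random(6) < 0.5).astype(float)
W, b_v, b_h = cd1_update(W, b_v, b_h, v0)
```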

Learning Non-deterministic Representations with Energy-based Ensembles

no code implementations • 23 Dec 2014 • Maruan Al-Shedivat, Emre Neftci, Gert Cauwenberghs

These mappings are encoded in a distribution over a (possibly infinite) collection of models.

One-Shot Learning

Gibbs Sampling with Low-Power Spiking Digital Neurons

no code implementations • 26 Mar 2015 • Srinjoy Das, Bruno Umbria Pedroni, Paul Merolla, John Arthur, Andrew S. Cassidy, Bryan L. Jackson, Dharmendra Modha, Gert Cauwenberghs, Ken Kreutz-Delgado

Restricted Boltzmann Machines and Deep Belief Networks have been successfully used in a wide variety of applications including image classification and speech recognition.

General Classification • Image Classification +2

Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines

no code implementations • 14 Nov 2015 • Emre O. Neftci, Bruno U. Pedroni, Siddharth Joshi, Maruan Al-Shedivat, Gert Cauwenberghs

Recent studies have shown that synaptic unreliability is a robust and sufficient mechanism for inducing the stochasticity observed in cortex.
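A minimal sketch of the kind of unreliable ("blank-out") synapse this refers to, in which each synapse forwards a presynaptic spike only with some probability; the shapes, names, and transmission probability below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def stochastic_synaptic_input(W, pre_spikes, p_transmit=0.5):
    """Postsynaptic input when each synapse forwards a presynaptic spike only
    with probability p_transmit (multiplicative Bernoulli 'blank-out' noise)."""
    mask = rng.random(W.shape) < p_transmit   # which synapses transmit at this time step
    return (W * mask) @ pre_spikes            # noisy weighted sum into the postsynaptic neurons

# Toy usage: 8 presynaptic neurons projecting onto 3 postsynaptic neurons
W = rng.normal(0.0, 1.0, (3, 8))
pre_spikes = (rng.random(8) < 0.2).astype(float)   # binary spike vector for one time step
print(stochastic_synaptic_input(W, pre_spikes))
```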

Forward Table-Based Presynaptic Event-Triggered Spike-Timing-Dependent Plasticity

no code implementations • 11 Jul 2016 • Bruno U. Pedroni, Sadique Sheik, Siddharth Joshi, Georgios Detorakis, Somnath Paul, Charles Augustine, Emre Neftci, Gert Cauwenberghs

We present a novel method for realizing both causal and acausal weight updates using only forward lookup access of the synaptic connectivity table, permitting memory-efficient implementation.
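This is not the paper's exact algorithm, but a minimal sketch of the general idea: resolve both the causal (pre-before-post) and acausal (post-before-pre) STDP terms at presynaptic events, so that only forward (pre → post) lookups of the connectivity table are ever needed. The constants, last-spike-time bookkeeping, and toy event stream are illustrative assumptions.

```python
import math
from collections import defaultdict

# Forward connectivity table only: presynaptic neuron -> list of (postsynaptic neuron, synapse id)
fwd_table = {0: [(10, 0), (11, 1)], 1: [(10, 2)]}
weights = defaultdict(float)                   # synapse id -> weight
last_post = defaultdict(lambda: -math.inf)     # postsynaptic neuron -> time of last spike
last_pre = defaultdict(lambda: -math.inf)      # presynaptic neuron -> time of previous spike

A_PLUS, A_MINUS, TAU = 0.01, 0.012, 20.0       # illustrative STDP constants (ms)

def on_post_spike(j, t):
    last_post[j] = t                           # bookkeeping only; no table access needed

def on_pre_spike(i, t):
    """All plasticity is resolved at presynaptic events using forward lookup only."""
    for j, syn in fwd_table[i]:
        if last_pre[i] < last_post[j] <= t:
            # causal pairing (previous pre spike -> post spike): potentiation, deferred to now
            dt = last_post[j] - last_pre[i]
            weights[syn] += A_PLUS * math.exp(-dt / TAU)
        if math.isfinite(last_post[j]) and last_post[j] <= t:
            # acausal pairing (post spike -> this pre spike): depression
            dt = t - last_post[j]
            weights[syn] -= A_MINUS * math.exp(-dt / TAU)
    last_pre[i] = t

# Toy event stream: pre neuron 0 fires, post neuron 10 fires, pre neuron 0 fires again
on_pre_spike(0, 5.0); on_post_spike(10, 12.0); on_pre_spike(0, 30.0)
print(dict(weights))
```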

Training a Probabilistic Graphical Model with Resistive Switching Electronic Synapses

no code implementations • 27 Sep 2016 • S. Burc Eryilmaz, Emre Neftci, Siddharth Joshi, Sang-Bum Kim, Matthew BrightSky, Hsiang-Lan Lung, Chung Lam, Gert Cauwenberghs, H. -S. Philip Wong

Current large-scale implementations of deep learning and data mining require thousands of processors and massive amounts of off-chip memory, and consume gigajoules of energy.

Membrane-Dependent Neuromorphic Learning Rule for Unsupervised Spike Pattern Detection

no code implementations • 5 Jan 2017 • Sadique Sheik, Somnath Paul, Charles Augustine, Gert Cauwenberghs

Several learning rules for synaptic plasticity, which depend on either spike timing or internal state variables, have been proposed in the past, imparting varying computational capabilities to Spiking Neural Networks.
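As a generic illustration of a membrane-dependent (rather than spike-timing-dependent) rule, the sketch below gates the weight change on each presynaptic spike by whether the postsynaptic membrane potential lies above or below a plasticity threshold; this is a simplified stand-in, not the paper's specific formulation, and all names and constants are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def membrane_gated_update(w, pre_spikes, v_mem, theta=0.5, lr=0.005):
    """On each presynaptic spike, nudge the weight up or down depending on whether
    the postsynaptic membrane potential is above or below a plasticity threshold."""
    return w + lr * pre_spikes * np.sign(v_mem - theta)

# Toy usage: 5 input synapses onto one neuron, a single time step
w = rng.uniform(0.0, 1.0, 5)
pre_spikes = np.array([1.0, 0.0, 1.0, 0.0, 1.0])   # which inputs spiked this step
v_mem = 0.8                                        # current membrane potential (arbitrary units)
w = membrane_gated_update(w, pre_spikes, v_mem)
```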

Hardware-efficient on-line learning through pipelined truncated-error backpropagation in binary-state networks

no code implementations • 15 Jun 2017 • Hesham Mostafa, Bruno Pedroni, Sadique Sheik, Gert Cauwenberghs

In this paper, we describe a hardware-efficient on-line learning technique for feedforward multi-layer ANNs that is based on pipelined backpropagation.

A learning framework for winner-take-all networks with stochastic synapses

no code implementations • 14 Aug 2017 • Hesham Mostafa, Gert Cauwenberghs

This allows us to use the proposed networks in a variational learning setting where stochastic backpropagation is used to optimize a lower bound on the data log likelihood, thereby learning a generative model of the data.
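The lower bound referred to here is the standard variational bound on the data log likelihood, written below in generic notation rather than the paper's own symbols:

$$\log p_\theta(x) \;\ge\; \mathbb{E}_{q_\phi(z\mid x)}\big[\log p_\theta(x\mid z)\big] \;-\; \mathrm{KL}\big(q_\phi(z\mid x)\,\|\,p_\theta(z)\big),$$

where the stochastic winner-take-all networks would play the role of $q_\phi$ and $p_\theta$, and stochastic backpropagation supplies gradient estimates of the expectation term with respect to both sets of parameters.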

Deep supervised learning using local errors

no code implementations • 17 Nov 2017 • Hesham Mostafa, Vishwajith Ramesh, Gert Cauwenberghs

Updating the features or weights in one layer, however, requires waiting for the propagation of error signals from higher layers.
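A minimal sketch of the layer-local alternative the paper describes: each hidden layer gets its own error signal from a fixed random readout onto the label space, so its weights can be updated without waiting for errors to propagate down from higher layers. The shapes and the fixed random matrix G are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(3)

def local_error_step(W, G, x, targets, lr=0.01):
    """Update one hidden layer from a purely local error: project its activation
    through a fixed random readout G onto the label space, form a local
    cross-entropy error, and backpropagate only within this layer."""
    h = np.maximum(0.0, W @ x)                     # ReLU hidden activation
    logits = G @ h                                 # local readout, G stays fixed
    probs = np.exp(logits - logits.max()); probs /= probs.sum()
    err = probs - targets                          # local error, no signal from higher layers
    dh = (G.T @ err) * (h > 0)
    W -= lr * np.outer(dh, x)
    return W, h

# Toy usage: 10 inputs, 16 hidden units, 3 classes
W = rng.normal(0, 0.1, (16, 10)); G = rng.normal(0, 0.1, (3, 16))
x = rng.normal(size=10); targets = np.array([0.0, 1.0, 0.0])
W, h = local_error_step(W, G, x, targets)
```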

Large-Scale Neuromorphic Spiking Array Processors: A quest to mimic the brain

no code implementations • 23 May 2018 • Chetan Singh Thakur, Jamal Molin, Gert Cauwenberghs, Giacomo Indiveri, Kundan Kumar, Ning Qiao, Johannes Schemmel, Runchun Wang, Elisabetta Chicca, Jennifer Olson Hasler, Jae-sun Seo, Shimeng Yu, Yu Cao, André van Schaik, Ralph Etienne-Cummings

Neuromorphic engineering (NE) encompasses a diverse range of approaches to information processing that are inspired by neurobiological systems, and this feature distinguishes neuromorphic systems from conventional computing systems.

Edge AI without Compromise: Efficient, Versatile and Accurate Neurocomputing in Resistive Random-Access Memory

no code implementations • 17 Aug 2021 • Weier Wan, Rajkumar Kubendran, Clemens Schaefer, S. Burc Eryilmaz, Wenqiang Zhang, Dabin Wu, Stephen Deiss, Priyanka Raina, He Qian, Bin Gao, Siddharth Joshi, Huaqiang Wu, H. -S. Philip Wong, Gert Cauwenberghs

Realizing today's cloud-level artificial intelligence functionalities directly on devices distributed at the edge of the internet calls for edge hardware capable of processing multiple modalities of sensory data (e.g., video, audio) at unprecedented energy efficiency.

Image Classification • Image Reconstruction

NeuroBench: A Framework for Benchmarking Neuromorphic Computing Algorithms and Systems

1 code implementation • 10 Apr 2023 • Jason Yik, Korneel Van den Berghe, Douwe den Blanken, Younes Bouhadjar, Maxime Fabre, Paul Hueber, Denis Kleyko, Noah Pacik-Nelson, Pao-Sheng Vincent Sun, Guangzhi Tang, Shenqi Wang, Biyan Zhou, Soikat Hasan Ahmed, George Vathakkattil Joseph, Benedetto Leto, Aurora Micheli, Anurag Kumar Mishra, Gregor Lenz, Tao Sun, Zergham Ahmed, Mahmoud Akl, Brian Anderson, Andreas G. Andreou, Chiara Bartolozzi, Arindam Basu, Petrut Bogdan, Sander Bohte, Sonia Buckley, Gert Cauwenberghs, Elisabetta Chicca, Federico Corradi, Guido de Croon, Andreea Danielescu, Anurag Daram, Mike Davies, Yigit Demirag, Jason Eshraghian, Tobias Fischer, Jeremy Forest, Vittorio Fra, Steve Furber, P. Michael Furlong, William Gilpin, Aditya Gilra, Hector A. Gonzalez, Giacomo Indiveri, Siddharth Joshi, Vedant Karia, Lyes Khacef, James C. Knight, Laura Kriener, Rajkumar Kubendran, Dhireesha Kudithipudi, Yao-Hong Liu, Shih-Chii Liu, Haoyuan Ma, Rajit Manohar, Josep Maria Margarit-Taulé, Christian Mayr, Konstantinos Michmizos, Dylan Muir, Emre Neftci, Thomas Nowotny, Fabrizio Ottati, Ayca Ozcelikkale, Priyadarshini Panda, Jongkil Park, Melika Payvand, Christian Pehle, Mihai A. Petrovici, Alessandro Pierro, Christoph Posch, Alpha Renner, Yulia Sandamirskaya, Clemens JS Schaefer, André van Schaik, Johannes Schemmel, Samuel Schmidgall, Catherine Schuman, Jae-sun Seo, Sadique Sheik, Sumit Bam Shrestha, Manolis Sifalakis, Amos Sironi, Matthew Stewart, Kenneth Stewart, Terrence C. Stewart, Philipp Stratmann, Jonathan Timcheck, Nergis Tömen, Gianvito Urgese, Marian Verhelst, Craig M. Vineyard, Bernhard Vogginger, Amirreza Yousefzadeh, Fatima Tuz Zohora, Charlotte Frenkel, Vijay Janapa Reddi

The NeuroBench framework introduces a common set of tools and systematic methodology for inclusive benchmark measurement, delivering an objective reference framework for quantifying neuromorphic approaches in both hardware-independent (algorithm track) and hardware-dependent (system track) settings.

Benchmarking

Active Inference in Hebbian Learning Networks

no code implementations • 8 Jun 2023 • Ali Safa, Tim Verbelen, Lars Keuninckx, Ilja Ocket, André Bourdoux, Francky Catthoor, Georges Gielen, Gert Cauwenberghs

This work studies how brain-inspired neural ensembles equipped with local Hebbian plasticity can perform active inference (AIF) in order to control dynamical agents.

OpenAI Gym • Q-Learning

Integrating LLM, EEG, and Eye-Tracking Biomarker Analysis for Word-Level Neural State Classification in Semantic Inference Reading Comprehension

no code implementations • 27 Sep 2023 • Yuhong Zhang, Qin Li, Sujal Nahata, Tasnia Jamal, Shih-kuen Cheng, Gert Cauwenberghs, Tzyy-Ping Jung

With the recent proliferation of large language models (LLMs), such as Generative Pre-trained Transformers (GPT), there has been a significant shift in exploring human and machine comprehension of semantic language meaning.

EEG • Feature Engineering +1

Towards Chip-in-the-loop Spiking Neural Network Training via Metropolis-Hastings Sampling

no code implementations • 9 Feb 2024 • Ali Safa, Vikrant Jaltare, Samira Sebt, Kameron Gano, Johannes Leugering, Georges Gielen, Gert Cauwenberghs

Our results show that the proposed approach significantly outperforms backprop, achieving up to $27\%$ higher accuracy when subject to strong hardware non-idealities.
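This is not the paper's exact procedure, but a generic sketch of the Metropolis-Hastings-style weight search the title refers to: propose a perturbation of the weights, evaluate the loss (which, chip-in-the-loop, would be measured on the hardware itself, non-idealities included), and accept or reject based on the change. The quadratic loss below is only a stand-in for that on-chip evaluation, and all names and constants are assumptions.

```python
import math
import random

random.seed(0)

def mh_weight_search(evaluate_loss, w_init, n_steps=100, step=0.05, temperature=0.1):
    """Generic Metropolis-Hastings search over network weights: propose a Gaussian
    perturbation, measure the loss, and accept or reject based on the loss change."""
    w, loss = list(w_init), evaluate_loss(w_init)
    for _ in range(n_steps):
        proposal = [wi + random.gauss(0.0, step) for wi in w]
        new_loss = evaluate_loss(proposal)
        if random.random() < min(1.0, math.exp((loss - new_loss) / temperature)):
            w, loss = proposal, new_loss
    return w, loss

# Toy usage: a simple quadratic loss stands in for the on-chip evaluation
evaluate_loss = lambda w: sum((wi - 0.3) ** 2 for wi in w)
w, loss = mh_weight_search(evaluate_loss, [0.0, 0.0, 0.0])
```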

Energy-efficiency Limits on Training AI Systems using Learning-in-Memory

no code implementations • 21 Feb 2024 • Zihao Chen, Johannes Leugering, Gert Cauwenberghs, Shantanu Chakrabartty

In this paper, we derive new theoretical lower bounds on energy dissipation when training AI systems using different LIM approaches.
