Search Results for author: Gert Cauwenberghs

Found 14 papers, 0 papers with code

Edge AI without Compromise: Efficient, Versatile and Accurate Neurocomputing in Resistive Random-Access Memory

no code implementations • 17 Aug 2021 • Weier Wan, Rajkumar Kubendran, Clemens Schaefer, S. Burc Eryilmaz, Wenqiang Zhang, Dabin Wu, Stephen Deiss, Priyanka Raina, He Qian, Bin Gao, Siddharth Joshi, Huaqiang Wu, H. -S. Philip Wong, Gert Cauwenberghs

Realizing today's cloud-level artificial intelligence functionalities directly on devices distributed at the edge of the internet calls for edge hardware capable of processing multiple modalities of sensory data (e.g. video, audio) at unprecedented energy efficiency.

Image Classification, Image Reconstruction

Large-Scale Neuromorphic Spiking Array Processors: A quest to mimic the brain

no code implementations • 23 May 2018 • Chetan Singh Thakur, Jamal Molin, Gert Cauwenberghs, Giacomo Indiveri, Kundan Kumar, Ning Qiao, Johannes Schemmel, Runchun Wang, Elisabetta Chicca, Jennifer Olson Hasler, Jae-sun Seo, Shimeng Yu, Yu Cao, André van Schaik, Ralph Etienne-Cummings

Neuromorphic engineering (NE) encompasses a diverse range of approaches to information processing that are inspired by neurobiological systems, and this feature distinguishes neuromorphic systems from conventional computing systems.

Deep supervised learning using local errors

no code implementations • 17 Nov 2017 • Hesham Mostafa, Vishwajith Ramesh, Gert Cauwenberghs

Updating the features or weights in one layer, however, requires waiting for the propagation of error signals from higher layers.
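The local-errors idea can be sketched with a single hidden layer trained from a layer-local loss, where the error reaches the layer through a fixed random readout instead of being propagated down from higher trainable layers. This is only an illustrative sketch; the data, sizes, and the fixed-readout choice below are assumptions, not the paper's exact setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical toy 2-class data standing in for a real dataset.
X = rng.standard_normal((64, 8))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
T = np.eye(2)[y]                            # one-hot targets

W = rng.standard_normal((8, 16)) * 0.1      # trainable hidden-layer weights
M = rng.standard_normal((16, 2)) * 0.1      # FIXED random local classifier

def local_loss(W):
    h = np.tanh(X @ W)
    return -np.mean(np.sum(T * np.log(softmax(h @ M) + 1e-12), axis=1))

loss_before = local_loss(W)
for _ in range(200):
    h = np.tanh(X @ W)
    p = softmax(h @ M)
    # The error reaches the layer through the fixed M only -- no waiting
    # for error signals propagated down from higher layers.
    e = ((p - T) @ M.T) * (1.0 - h ** 2)
    W -= 0.1 * X.T @ e / len(X)
loss_after = local_loss(W)
```

Because `M` is fixed, each layer's update depends only on locally available quantities, which is what removes the backward-pass dependency the snippet above describes.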

A learning framework for winner-take-all networks with stochastic synapses

no code implementations • 14 Aug 2017 • Hesham Mostafa, Gert Cauwenberghs

This allows us to use the proposed networks in a variational learning setting where stochastic backpropagation is used to optimize a lower bound on the data log likelihood, thereby learning a generative model of the data.
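The core mechanism can be sketched without the variational machinery: when each synapse transmits only with some probability, the winner-take-all outcome becomes a sample from a distribution over units rather than a deterministic choice. The sizes and release probability below are made up for illustration and are not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

W = rng.uniform(0.0, 1.0, size=(4, 8))   # 8 inputs -> 4 WTA units (hypothetical)
x = rng.uniform(0.0, 1.0, size=8)        # input activity
p_release = 0.5                          # synaptic transmission probability

def wta_sample():
    # Each synapse transmits independently with probability p_release;
    # the unit receiving the largest stochastic drive wins.
    mask = rng.random(W.shape) < p_release
    drive = (W * mask) @ x
    return int(np.argmax(drive))

# Repeated trials yield a distribution over winners, not a fixed winner.
wins = np.bincount([wta_sample() for _ in range(2000)], minlength=4)
probs = wins / wins.sum()
```

It is this induced distribution over outcomes that a variational objective can then shape via stochastic backpropagation.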

Hardware-efficient on-line learning through pipelined truncated-error backpropagation in binary-state networks

no code implementations • 15 Jun 2017 • Hesham Mostafa, Bruno Pedroni, Sadique Sheik, Gert Cauwenberghs

In this paper, we describe a hardware-efficient on-line learning technique for feedforward multi-layer ANNs that is based on pipelined backpropagation.
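A minimal sketch of the two ingredients named in the title: errors truncated to their sign (cheap in fixed-point hardware), and a lower layer that updates from the previous iteration's error so both layers can update concurrently in a pipeline. The task, the straight-through estimator for the binary activation, and all sizes are assumptions for illustration, not the paper's exact scheme:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical toy regression task with binary hidden states.
X = rng.standard_normal((32, 6))
Y = np.sign(X @ rng.standard_normal((6, 1)))

W1 = rng.standard_normal((6, 12)) * 0.5
W2 = rng.standard_normal((12, 1)) * 0.5

def mse(W1, W2):
    out = np.sign(X @ W1) @ W2
    return float(np.mean((out - Y) ** 2))

before = mse(W1, W2)
err_prev = np.zeros_like(Y)          # pipeline register holding the stale error
for _ in range(200):
    a = X @ W1
    h = np.sign(a)                   # binary hidden state
    err = h @ W2 - Y
    # Truncated error: only the sign is kept.
    W2 -= 0.01 * h.T @ np.sign(err) / len(X)
    # The lower layer sees the error from the *previous* iteration, so the
    # two layer updates need not wait on each other; (|a| < 1) is a
    # straight-through stand-in for the derivative of sign().
    e1 = (np.sign(err_prev) @ W2.T) * (np.abs(a) < 1)
    W1 -= 0.002 * X.T @ e1 / len(X)
    err_prev = err
after = mse(W1, W2)
```

The pipeline register is the key hardware point: the backward pass of step t overlaps with the forward pass of step t+1 instead of stalling it.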

Membrane-Dependent Neuromorphic Learning Rule for Unsupervised Spike Pattern Detection

no code implementations • 5 Jan 2017 • Sadique Sheik, Somnath Paul, Charles Augustine, Gert Cauwenberghs

Several learning rules for synaptic plasticity that depend on either spike timing or internal state variables have been proposed, imparting varying computational capabilities to spiking neural networks.

Training a Probabilistic Graphical Model with Resistive Switching Electronic Synapses

no code implementations • 27 Sep 2016 • S. Burc Eryilmaz, Emre Neftci, Siddharth Joshi, Sang-Bum Kim, Matthew BrightSky, Hsiang-Lan Lung, Chung Lam, Gert Cauwenberghs, H. -S. Philip Wong

Current large-scale implementations of deep learning and data mining require thousands of processors and massive amounts of off-chip memory, and consume gigajoules of energy.

Forward Table-Based Presynaptic Event-Triggered Spike-Timing-Dependent Plasticity

no code implementations • 11 Jul 2016 • Bruno U. Pedroni, Sadique Sheik, Siddharth Joshi, Georgios Detorakis, Somnath Paul, Charles Augustine, Emre Neftci, Gert Cauwenberghs

We present a novel method for realizing both causal and acausal weight updates using only forward lookup access of the synaptic connectivity table, permitting memory-efficient implementation.
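The idea can be sketched as follows: all STDP bookkeeping happens on presynaptic events using only the forward (pre → post) connectivity table, with causal (pre-before-post) potentiation deferred until the next presynaptic spike, so no reverse (post → pre) table is ever needed. The toy connectivity and parameters below are illustrative assumptions, not the paper's:

```python
import numpy as np

# Forward connectivity table only: pre neuron -> list of post neurons.
fwd = {0: [1, 2]}                       # hypothetical toy network
w = {(0, 1): 0.5, (0, 2): 0.5}          # synaptic weights
t_last_pre = {0: None}
t_last_post = {1: None, 2: None}
A_plus, A_minus, tau = 0.05, 0.05, 20.0

def post_spike(post, t):
    # Postsynaptic events only record a time stamp -- no table lookup.
    t_last_post[post] = t

def pre_spike(pre, t):
    # All weight updates are triggered here, via forward lookup alone.
    for post in fwd[pre]:
        tp = t_last_post[post]
        if tp is not None:
            if t_last_pre[pre] is not None and tp > t_last_pre[pre]:
                # Causal pair (previous pre spike -> this post spike):
                # potentiation, applied lazily at this pre event.
                w[(pre, post)] += A_plus * np.exp(-(tp - t_last_pre[pre]) / tau)
            # Acausal pair (post spike -> current pre spike): depression.
            w[(pre, post)] -= A_minus * np.exp(-(t - tp) / tau)
    t_last_pre[pre] = t

# Pre fires at t=0, post 1 fires at t=5 (causal pair), pre fires again at t=30.
pre_spike(0, 0.0)
post_spike(1, 5.0)
pre_spike(0, 30.0)
```

After this spike sequence the 0→1 synapse is net potentiated (the causal pair at Δt = 5 outweighs the acausal pair at Δt = 25), while the 0→2 synapse is untouched, since neuron 2 never fired.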

Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines

no code implementations • 14 Nov 2015 • Emre O. Neftci, Bruno U. Pedroni, Siddharth Joshi, Maruan Al-Shedivat, Gert Cauwenberghs

Recent studies have shown that synaptic unreliability is a robust and sufficient mechanism for inducing the stochasticity observed in cortex.

Gibbs Sampling with Low-Power Spiking Digital Neurons

no code implementations • 26 Mar 2015 • Srinjoy Das, Bruno Umbria Pedroni, Paul Merolla, John Arthur, Andrew S. Cassidy, Bryan L. Jackson, Dharmendra Modha, Gert Cauwenberghs, Ken Kreutz-Delgado

Restricted Boltzmann Machines and Deep Belief Networks have been successfully used in a wide variety of applications including image classification and speech recognition.
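The RBM inference the paper maps onto spiking digital neurons is block Gibbs sampling: hidden units are conditionally independent Bernoulli variables given the visible layer, and vice versa, so each unit's binary state is naturally a "spike". A generic RBM sketch with hypothetical sizes (not the hardware implementation):

```python
import numpy as np

rng = np.random.default_rng(3)

nv, nh = 6, 4                              # hypothetical layer sizes
W = rng.standard_normal((nv, nh)) * 0.1    # visible-hidden weights
bv, bh = np.zeros(nv), np.zeros(nh)        # biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v):
    # Block Gibbs sampling: sample all hidden units given v, then all
    # visible units given h.  Each unit is a binary (spiking) sample.
    h = (rng.random(nh) < sigmoid(v @ W + bh)).astype(float)
    v = (rng.random(nv) < sigmoid(h @ W.T + bv)).astype(float)
    return v, h

v = (rng.random(nv) < 0.5).astype(float)   # random binary initial state
samples = []
for _ in range(500):
    v, h = gibbs_step(v)
    samples.append(v)
mean_v = np.mean(samples, axis=0)          # empirical visible marginals
```

The conditional independence within each layer is what makes the sampler parallel, and hence a good fit for an array of low-power digital neurons updating concurrently.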

General Classification, Image Classification, +1

Learning Non-deterministic Representations with Energy-based Ensembles

no code implementations • 23 Dec 2014 • Maruan Al-Shedivat, Emre Neftci, Gert Cauwenberghs

These mappings are encoded in a distribution over a (possibly infinite) collection of models.

One-Shot Learning

Event-Driven Contrastive Divergence for Spiking Neuromorphic Systems

no code implementations • 5 Nov 2013 • Emre Neftci, Srinjoy Das, Bruno Pedroni, Kenneth Kreutz-Delgado, Gert Cauwenberghs

However, the traditional RBM architecture and the commonly used training algorithm known as Contrastive Divergence (CD) are based on discrete updates and exact arithmetic, which do not map directly onto a dynamical neural substrate.

Dimensionality Reduction
