Search Results for author: Bipin Rajendran

Found 17 papers, 1 paper with code

Fast On-Device Adaptation for Spiking Neural Networks via Online-Within-Online Meta-Learning

no code implementations • 21 Feb 2021 • Bleema Rosenfeld, Bipin Rajendran, Osvaldo Simeone

Owing to their low power profile, Spiking Neural Networks (SNNs) have recently gained popularity as machine learning models for on-device edge intelligence, in applications such as mobile healthcare management and natural language processing.


Hybrid In-memory Computing Architecture for the Training of Deep Neural Networks

no code implementations • 10 Feb 2021 • Vinay Joshi, Wangxin He, Jae-sun Seo, Bipin Rajendran

We propose a hybrid in-memory computing (HIC) architecture for the training of DNNs on hardware accelerators that results in memory-efficient inference and outperforms baseline software accuracy in benchmark tasks.

SpinAPS: A High-Performance Spintronic Accelerator for Probabilistic Spiking Neural Networks

no code implementations • 5 Aug 2020 • Anakha V Babu, Osvaldo Simeone, Bipin Rajendran

We discuss a high-performance and high-throughput hardware accelerator for probabilistic Spiking Neural Networks (SNNs) based on Generalized Linear Model (GLM) neurons that uses binary STT-RAM devices as synapses and digital CMOS logic for neurons.
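A GLM neuron of this kind emits spikes probabilistically, with the spike probability at each time step given by a sigmoid of the input spike history passed through a synaptic filter. The sketch below illustrates that idea only; the kernel, bias, and function names are illustrative choices, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def glm_spike_probs(inputs, kernel, bias=-1.0):
    """Spike probability per time step: sigmoid of the input spike
    history filtered by a synaptic kernel (illustrative GLM neuron)."""
    drive = np.convolve(inputs, kernel)[: len(inputs)] + bias
    return 1.0 / (1.0 + np.exp(-drive))

inputs = rng.integers(0, 2, size=20).astype(float)  # binary input spike train
kernel = np.array([1.0, 0.5, 0.25])                 # short decaying filter
probs = glm_spike_probs(inputs, kernel)
spikes = (rng.random(20) < probs).astype(int)       # sample output spikes
```

Because the output is sampled rather than thresholded, the same input can yield different spike trains across trials, which is what makes the model probabilistic.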

Activity Recognition

ESSOP: Efficient and Scalable Stochastic Outer Product Architecture for Deep Learning

no code implementations • 25 Mar 2020 • Vinay Joshi, Geethan Karunaratne, Manuel Le Gallo, Irem Boybat, Christophe Piveteau, Abu Sebastian, Bipin Rajendran, Evangelos Eleftheriou

Strategies to improve the efficiency of MVM computation in hardware have been demonstrated with minimal impact on training accuracy.
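The outer-product weight update that ESSOP targets can be approximated stochastically: encode each operand as a Bernoulli bit stream and average the bitwise products, so multipliers reduce to AND gates. The snippet below is a minimal software sketch of that general principle, assuming values normalized to [0, 1]; it is not the ESSOP hardware design.

```python
import numpy as np

rng = np.random.default_rng(1)

def stochastic_outer(delta, x, n_samples=1000):
    """Approximate the outer product delta * x.T by averaging products
    of Bernoulli samples whose firing rates encode the values."""
    acc = np.zeros((len(delta), len(x)))
    for _ in range(n_samples):
        d_bits = (rng.random(len(delta)) < delta).astype(float)
        x_bits = (rng.random(len(x)) < x).astype(float)
        acc += np.outer(d_bits, x_bits)   # bitwise AND in hardware
    return acc / n_samples

delta = np.array([0.2, 0.8])   # error signal (normalized)
x = np.array([0.5, 0.1, 0.9])  # layer activations (normalized)
approx = stochastic_outer(delta, x)
exact = np.outer(delta, x)
```

The approximation error shrinks as the bit-stream length grows, trading accuracy for hardware simplicity.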

Accurate deep neural network inference using computational phase-change memory

no code implementations • 7 Jun 2019 • Vinay Joshi, Manuel Le Gallo, Irem Boybat, Simon Haefeli, Christophe Piveteau, Martino Dazzi, Bipin Rajendran, Abu Sebastian, Evangelos Eleftheriou

In-memory computing is a promising non-von Neumann approach where certain computational tasks are performed within memory units by exploiting the physical attributes of memory devices.
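The canonical in-memory operation is an analog matrix-vector multiply: conductances stored in a crossbar array multiply input voltages via Ohm's law, and output currents sum via Kirchhoff's law. A minimal numerical sketch, with an assumed Gaussian read-noise level standing in for device non-idealities (the noise magnitude is illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(4)

def crossbar_mvm(G, v, read_noise=0.02):
    """Matrix-vector multiply as performed by a memristive crossbar:
    output currents are conductances G times input voltages v, plus
    Gaussian read noise modelling analog non-idealities."""
    ideal = G @ v
    return ideal + read_noise * rng.standard_normal(ideal.shape)

G = rng.uniform(0.0, 1.0, (4, 8))   # conductance matrix (normalized units)
v = rng.uniform(-1.0, 1.0, 8)       # input voltage vector
i_out = crossbar_mvm(G, v)
```

The multiply-accumulate happens in a single analog step regardless of matrix size, which is the source of the efficiency gain over fetching weights from separate memory.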

Emerging Technologies

Supervised Learning in Spiking Neural Networks with Phase-Change Memory Synapses

no code implementations • 28 May 2019 • S. R. Nandakumar, Irem Boybat, Manuel Le Gallo, Evangelos Eleftheriou, Abu Sebastian, Bipin Rajendran

Combining the computational potential of supervised SNNs with the parallel compute power of computational memory, the work paves the way for the next generation of efficient brain-inspired systems.

Low-Power Neuromorphic Hardware for Signal Processing Applications

no code implementations • 11 Jan 2019 • Bipin Rajendran, Abu Sebastian, Michael Schmuker, Narayan Srinivasa, Evangelos Eleftheriou

In this paper, we review some of the architectural and system level design aspects involved in developing a new class of brain-inspired information processing engines that mimic the time-based information encoding and processing aspects of the brain.

Training Multi-layer Spiking Neural Networks using NormAD based Spatio-Temporal Error Backpropagation

no code implementations • 23 Oct 2018 • Navin Anwani, Bipin Rajendran

To tackle this, the problem of training a multi-layer SNN is first formulated as an optimization problem whose objective function is based on the deviation in membrane potential rather than the spike arrival instants.
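A membrane-potential-based objective of this flavour can be sketched as follows: penalize the potential for falling short of threshold at desired spike times and for exceeding threshold elsewhere. This is a toy surrogate illustrating the general idea only, with illustrative names and an arbitrary threshold; it is not the paper's NormAD formulation.

```python
import numpy as np

V_THRESH = 1.0  # firing threshold (arbitrary units, assumed)

def membrane_deviation_loss(v, desired_spikes):
    """Quadratic penalty on membrane-potential deviation: below threshold
    at desired spike times, above threshold at undesired ones. Smooth in
    v, unlike an error defined on spike arrival instants."""
    want = desired_spikes.astype(bool)
    miss = np.maximum(0.0, V_THRESH - v[want])    # should fire, potential too low
    false = np.maximum(0.0, v[~want] - V_THRESH)  # fires where it should not
    return 0.5 * (np.sum(miss**2) + np.sum(false**2))

v = np.array([0.2, 1.3, 0.8, 1.1])   # membrane potential per time step
desired = np.array([0, 1, 1, 0])     # target spike train
loss = membrane_deviation_loss(v, desired)  # 0.5 * (0.2**2 + 0.1**2) = 0.025
```

Because the penalty is a smooth function of the potential, its gradient with respect to the weights is well defined, sidestepping the non-differentiability of spike times.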

Learning First-to-Spike Policies for Neuromorphic Control Using Policy Gradients

no code implementations • 23 Oct 2018 • Bleema Rosenfeld, Osvaldo Simeone, Bipin Rajendran

In this work, the use of SNNs as stochastic policies is explored under an energy-efficient first-to-spike action rule, whereby the action taken by the RL agent is determined by the occurrence of the first spike among the output neurons.
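Under a first-to-spike rule, the decision is read out as soon as any output neuron fires, so higher-rate neurons tend to win and decision latency shrinks. A minimal sketch of such a readout, assuming Bernoulli spiking with per-step rates (the sampling model and fallback are assumptions, not the paper's policy parameterization):

```python
import numpy as np

rng = np.random.default_rng(2)

def first_to_spike_action(rates, t_max=100):
    """Sample Bernoulli spike trains for the output neurons and return
    the index of the first neuron to spike (ties broken by lowest
    index); higher-rate neurons are more likely to act first."""
    for _ in range(t_max):
        spikes = rng.random(len(rates)) < rates
        if spikes.any():
            return int(np.argmax(spikes))  # first spiking neuron this step
    return int(np.argmax(rates))           # fallback: no spike within t_max

rates = np.array([0.05, 0.3, 0.1])         # per-step spike probabilities
actions = [first_to_spike_action(rates) for _ in range(200)]
```

The readout is inherently stochastic, which is what lets the SNN serve directly as a stochastic policy for policy-gradient training.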

Adversarial Training for Probabilistic Spiking Neural Networks

no code implementations • 22 Feb 2018 • Alireza Bagheri, Osvaldo Simeone, Bipin Rajendran

Due to the prominence of Artificial Neural Networks (ANNs) as classifiers, their sensitivity to adversarial examples, as well as robust training schemes, have recently been the subject of intense investigation.

Learning and Real-time Classification of Hand-written Digits With Spiking Neural Networks

no code implementations • 9 Nov 2017 • Shruti R. Kulkarni, John M. Alexiades, Bipin Rajendran

On the standard MNIST database images of handwritten digits, our network achieves an accuracy of 99.80% on the training set and 98.06% on the test set, with nearly 7x fewer parameters compared to the state-of-the-art spiking networks.

General Classification

Stochastic Deep Learning in Memristive Networks

no code implementations • 9 Nov 2017 • Anakha V Babu, Bipin Rajendran

We also study the performance of stochastic memristive DNNs when used as inference engines with noise-corrupted data, and find that if device variability can be minimized, the relative degradation in performance of the stochastic DNN is smaller than that of the software baseline.
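This kind of robustness study can be mimicked in software by perturbing stored weights with multiplicative noise, standing in for device-to-device variability, and measuring accuracy over repeated noisy reads. The sketch below does this for a tiny hand-built linear classifier; the noise model, data, and weights are all illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(3)

def accuracy_under_weight_noise(W, X, y, sigma, n_trials=20):
    """Average classification accuracy when each weight is perturbed by
    multiplicative Gaussian noise of relative std sigma, modelling
    memristive device variability (illustrative)."""
    accs = []
    for _ in range(n_trials):
        W_noisy = W * (1.0 + sigma * rng.standard_normal(W.shape))
        pred = np.argmax(X @ W_noisy.T, axis=1)
        accs.append(np.mean(pred == y))
    return float(np.mean(accs))

# Tiny synthetic 2-class problem with well-separated clusters
X = np.vstack([rng.normal(-2, 0.3, (50, 2)), rng.normal(2, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
W = np.array([[-1.0, -1.0], [1.0, 1.0]])   # hand-set linear readout

clean = accuracy_under_weight_noise(W, X, y, sigma=0.0)
noisy = accuracy_under_weight_noise(W, X, y, sigma=0.5)
```

Sweeping `sigma` and comparing the accuracy curves of a stochastic network against a deterministic baseline is one simple way to quantify the relative degradation the abstract refers to.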

Training Probabilistic Spiking Neural Networks with First-to-spike Decoding

1 code implementation • 29 Oct 2017 • Alireza Bagheri, Osvaldo Simeone, Bipin Rajendran

Third-generation neural networks, or Spiking Neural Networks (SNNs), aim at harnessing the energy efficiency of spike-domain processing by building on computing elements that operate on, and exchange, spikes.

Classification • General Classification

Efficient and Robust Spiking Neural Circuit for Navigation Inspired by Echolocating Bats

no code implementations • NeurIPS 2016 • Pulkit Tandon, Yash H. Malviya, Bipin Rajendran

We demonstrate a spiking neural circuit for azimuth angle detection inspired by the echolocation circuits of the Horseshoe bat Rhinolophus ferrumequinum and utilize it to devise a model for navigation and target tracking, capturing several key aspects of information transmission in biology.

Sub-threshold CMOS Spiking Neuron Circuit Design for Navigation Inspired by C. elegans Chemotaxis

no code implementations • 29 Oct 2014 • Shibani Santurkar, Bipin Rajendran

We demonstrate a spiking neural network for navigation motivated by the chemotaxis network of Caenorhabditis elegans.

A neural circuit for navigation inspired by C. elegans Chemotaxis

no code implementations • 29 Oct 2014 • Shibani Santurkar, Bipin Rajendran

In order to harness the computational advantages spiking neural networks promise over their non-spiking counterparts, we develop a network comprising 7 spiking neurons with non-plastic synapses, which we show is extremely robust in tracking a range of concentrations.
