no code implementations • 17 Jun 2022 • Narsimha Chilkuri, Chris Eliasmith
In this report we consider the following problem: Given a trained model that is partially faulty, can we correct its behaviour without having to train the model from scratch?
no code implementations • 5 Oct 2021 • Narsimha Chilkuri, Eric Hunsberger, Aaron Voelker, Gurshaant Malik, Chris Eliasmith
Across three orders of magnitude of parameter counts, we show that our new architecture attains the same accuracy as transformers with 10x fewer tokens.
no code implementations • 16 Jun 2021 • Kinjal Patel, Eric Hunsberger, Sean Batir, Chris Eliasmith
We explore the advantages of regularizing the firing rates of Loihi neurons for converting ANNs to SNNs with minimal accuracy loss and optimized energy consumption.
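The general idea behind rate regularization for ANN-to-SNN conversion can be sketched as follows. This is a generic illustration, not the exact penalty used in the paper: an L2 term pulls firing rates toward a target, since rates that are too low hurt accuracy after conversion while rates that are too high waste spikes and energy. The function name and parameter values are illustrative assumptions.

```python
import numpy as np

def rate_regularized_loss(task_loss, rates, target_rate=100.0, reg_weight=1e-4):
    """Add an L2 penalty pulling firing rates toward a target rate.

    A sketch of rate regularization for ANN-to-SNN conversion (the
    exact penalty used on Loihi may differ): keeping rates near a
    moderate target avoids both near-silent neurons (accuracy loss
    after conversion) and very high rates (wasted spikes and energy).
    """
    rate_penalty = np.mean((rates - target_rate) ** 2)
    return task_loss + reg_weight * rate_penalty

# The penalty shrinks as rates approach the target range.
rates_far = np.array([10.0, 400.0, 5.0])
rates_near = np.array([90.0, 110.0, 100.0])
loss_far = rate_regularized_loss(1.0, rates_far)
loss_near = rate_regularized_loss(1.0, rates_near)
assert loss_near < loss_far
```

In practice such a penalty is added to the training loss of the source ANN, so the learned activations already sit in the range the neuromorphic hardware handles efficiently.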
2 code implementations • 22 Feb 2021 • Narsimha Chilkuri, Chris Eliasmith
For instance, our LMU sets a new state-of-the-art result on psMNIST, and uses half the parameters while outperforming DistilBERT and LSTM models on IMDB sentiment analysis.
Ranked #6 on Sequential Image Classification on Sequential MNIST
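The key observation enabling parallel training of the LMU is that its memory recurrence uses fixed, non-learned matrices, so the recurrence m_t = A m_{t-1} + B x_t unrolls into a convolution with a precomputable impulse response and can be evaluated across all time steps at once. A minimal numpy sketch of that equivalence (the matrices here are illustrative stand-ins, not the LMU's actual discretized Legendre system):

```python
import numpy as np

rng = np.random.default_rng(0)
d, T = 4, 16                      # memory size, sequence length

# Fixed (non-learned) state-space matrices, scaled to be stable.
# Illustrative values -- the LMU derives these from a Legendre delay system.
A = 0.9 * np.linalg.qr(rng.standard_normal((d, d)))[0]
B = rng.standard_normal((d, 1))
x = rng.standard_normal(T)        # scalar input sequence

# Sequential recurrence: m_t = A m_{t-1} + B x_t
m_seq = np.zeros((T, d))
m = np.zeros((d, 1))
for t in range(T):
    m = A @ m + B * x[t]
    m_seq[t] = m.ravel()

# Parallel form: m_t = sum_{k=0..t} A^k B x_{t-k}, i.e. a convolution
# with the precomputed impulse response H[k] = A^k B.
H = np.stack([np.linalg.matrix_power(A, k) @ B for k in range(T)])  # (T, d, 1)
m_par = np.stack([
    sum(H[k, :, 0] * x[t - k] for k in range(t + 1)) for t in range(T)
])

assert np.allclose(m_seq, m_par)
```

Because the convolution has no step-to-step dependency, the memory for an entire sequence can be computed as a feedforward operation, which is what makes transformer-style parallel training possible.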
no code implementations • 1 Jan 2021 • Narsimha Reddy Chilkuri, Chris Eliasmith
Our models, despite their simplicity, achieve new state-of-the-art results for RNNs on psMNIST and QQP, and exhibit superior performance on the remaining three datasets while using up to 1000x fewer parameters.
no code implementations • 18 Sep 2020 • Yexin Yan, Terrence C. Stewart, Xuan Choo, Bernhard Vogginger, Johannes Partzsch, Sebastian Hoeppner, Florian Kelber, Chris Eliasmith, Steve Furber, Christian Mayr
We implemented two neural network based benchmark tasks on a prototype chip of the second-generation SpiNNaker (SpiNNaker 2) neuromorphic system: keyword spotting and adaptive robotic control.
no code implementations • 9 Sep 2020 • Peter Blouw, Gurshaant Malik, Benjamin Morcos, Aaron R. Voelker, Chris Eliasmith
Keyword spotting (KWS) provides a critical user interface for many mobile and edge applications, including phones, wearables, and cars.
1 code implementation • 20 Jul 2020 • Travis DeWolf, Pawel Jaworski, Chris Eliasmith
In this paper we demonstrate how the Nengo neural modeling and simulation libraries enable users to quickly develop robotic perception and action neural networks for simulation on neuromorphic hardware using familiar tools, such as Keras and Python.
no code implementations • 10 Feb 2020 • Aaron R. Voelker, Daniel Rasmussen, Chris Eliasmith
The machine learning community has become increasingly interested in the energy efficiency of neural networks.
2 code implementations • NeurIPS 2019 • Aaron Voelker, Ivana Kajić, Chris Eliasmith
Backpropagation through the ODE solver allows each layer to adapt its internal time-step, enabling the network to learn task-relevant time-scales.
Ranked #12 on Sequential Image Classification on Sequential MNIST
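One way to read "each layer adapts its internal time-step": the LMU's continuous-time memory dynamics m'(t) = A m(t) + B u(t) must be discretized with some step dt, and dt sets the layer's time-scale. A hedged sketch using the published closed form for the continuous delay matrices, with a simple Euler discretization standing in for whatever solver a given implementation uses:

```python
import numpy as np

def lmu_matrices(d, theta=1.0):
    """Continuous-time LMU delay matrices (the published closed form).

    m'(t) = A m(t) + B u(t) compresses a sliding window of length
    theta onto d Legendre coefficients.
    """
    A = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            A[i, j] = (2 * i + 1) * (-1.0 if i < j else (-1.0) ** (i - j + 1))
    q = np.arange(d)
    B = ((2 * q + 1) * (-1.0) ** q).reshape(d, 1)
    return A / theta, B / theta

A, B = lmu_matrices(d=6, theta=1.0)

# The delay system is stable: all eigenvalues in the left half-plane.
assert np.linalg.eigvals(A).real.max() < 0

# A simple Euler discretization (an illustrative stand-in for the
# paper's ODE solver); because training backpropagates through this
# step, dt itself can be adapted, letting each layer learn its own
# task-relevant time-scale.
dt = 0.01
A_bar = np.eye(6) + dt * A
B_bar = dt * B
```

Smaller dt gives finer time resolution; larger dt (or larger theta) stretches the memory over longer time-scales.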
2 code implementations • 26 Apr 2019 • Andreas Stöckel, Chris Eliasmith
Nonlinear interactions in the dendritic tree play a key role in neural computation.
1 code implementation • 4 Dec 2018 • Peter Blouw, Xuan Choo, Eric Hunsberger, Chris Eliasmith
Using Intel's Loihi neuromorphic research chip and ABR's Nengo Deep Learning toolkit, we analyze the inference speed, dynamic power consumption, and energy cost per inference of a two-layer neural network keyword spotter trained to recognize a single phrase.
no code implementations • 20 Oct 2017 • Andreas Stöckel, Aaron R. Voelker, Chris Eliasmith
This, in particular, significantly affects the influence of inhibitory signals on the neuronal dynamics.
no code implementations • 27 Aug 2017 • Aaron R. Voelker, Chris Eliasmith
We review our current software tools and theoretical methods for applying the Neural Engineering Framework to state-of-the-art neuromorphic hardware.
1 code implementation • 16 Nov 2016 • Eric Hunsberger, Chris Eliasmith
We describe a method to train spiking deep networks that can be run using leaky integrate-and-fire (LIF) neurons, achieving state-of-the-art results for spiking LIF networks on five datasets, including the large ImageNet ILSVRC-2012 benchmark.
no code implementations • 16 Feb 2016 • Chris Eliasmith, Jan Gosmann, Xuan Choo
We describe a large-scale functional brain model that includes detailed, conductance-based, compartmental models of individual neurons.
2 code implementations • 29 Oct 2015 • Eric Hunsberger, Chris Eliasmith
We train spiking deep networks using leaky integrate-and-fire (LIF) neurons, and achieve state-of-the-art results for spiking networks on the CIFAR-10 and MNIST datasets.
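The training trick in this line of work is a "soft LIF" rate curve: the hard LIF firing-rate function has a non-differentiable kink at threshold, and smoothing it with a softplus makes the network trainable by ordinary backpropagation before being run with spiking LIF neurons. A sketch with illustrative parameter values (the paper's exact constants may differ):

```python
import numpy as np

def soft_lif_rate(j, tau_rc=0.02, tau_ref=0.002, gamma=0.03):
    """Smoothed LIF firing-rate curve (a sketch of the soft-LIF idea).

    The hard LIF rate 1 / (tau_ref + tau_rc * log(1 + 1 / max(j - 1, 0)))
    is non-differentiable at threshold (j = 1); replacing the max with
    a softplus rho(x) = gamma * log(1 + exp(x / gamma)) makes the curve
    differentiable everywhere. Parameter values here are illustrative.
    """
    j = np.asarray(j, dtype=float)
    rho = gamma * np.log1p(np.exp((j - 1.0) / gamma))  # smoothed max(j - 1, 0)
    return 1.0 / (tau_ref + tau_rc * np.log1p(1.0 / rho))

# Below threshold the rate is small; above threshold it grows smoothly.
rates = soft_lif_rate(np.array([0.5, 1.5, 3.0]))
assert rates[0] < rates[1] < rates[2]
```

After training with the smooth curve, the weights are reused directly in a spiking LIF network, whose time-averaged rates approximate the trained curve.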
2 code implementations • SciPy 2014 • Brent Komer, James Bergstra, Chris Eliasmith
Hyperopt-sklearn is a new software project that provides automatic algorithm configuration of the Scikit-learn machine learning library.
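What hyperopt-sklearn automates is the joint search over the choice of algorithm *and* its hyperparameters. A self-contained toy sketch of that pattern using plain random search and stand-in scoring functions (the real library searches over actual Scikit-learn estimators with cross-validation, via Hyperopt's optimizers; everything named below is illustrative):

```python
import random

# Stand-in "algorithms": each maps a hyperparameter dict to a score.
# In hyperopt-sklearn the equivalents are real Scikit-learn estimators
# scored by cross-validation.
SEARCH_SPACE = {
    "svm": lambda p: 0.8 - abs(p["C"] - 1.0) * 0.1,   # best near C = 1
    "knn": lambda p: 0.7 - abs(p["k"] - 5) * 0.02,    # best near k = 5
}

def sample_config(rng):
    """Sample an (algorithm, hyperparameters) pair from the joint space."""
    algo = rng.choice(list(SEARCH_SPACE))
    params = {"C": rng.uniform(0.01, 10)} if algo == "svm" else {"k": rng.randint(1, 30)}
    return algo, params

def random_search(n_trials=200, seed=0):
    rng = random.Random(seed)
    best = (None, None, float("-inf"))
    for _ in range(n_trials):
        algo, params = sample_config(rng)
        score = SEARCH_SPACE[algo](params)   # stands in for CV accuracy
        if score > best[2]:
            best = (algo, params, score)
    return best

algo, params, score = random_search()
assert algo == "svm"     # the search discovers the better algorithm family
```

Hyperopt-sklearn wraps this whole loop behind an estimator object with the familiar fit/predict interface, so model selection looks like fitting a single Scikit-learn model.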
no code implementations • NeurIPS 2011 • Julie Dethier, Paul Nuyujukian, Chris Eliasmith, Terrence C. Stewart, Shauki A. Elasaad, Krishna V. Shenoy, Kwabena A. Boahen
The Kalman filter was trained to predict the arm's velocity and mapped onto the SNN using the Neural Engineering Framework (NEF).
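The decoder underlying this entry is the standard Kalman predict/update recursion applied to neural data: velocity evolves under a smoothness prior and is observed through a linear tuning model. A generic numpy sketch on simulated data (dimensions, matrices, and noise levels are illustrative, not the paper's trained filter):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dimensions: 2-D arm velocity decoded from 20 neural channels.
n_state, n_obs, T = 2, 20, 300

A = 0.95 * np.eye(n_state)                 # velocity dynamics (smoothness prior)
C = rng.standard_normal((n_obs, n_state))  # linear tuning / observation matrix
Q = 0.01 * np.eye(n_state)                 # process noise covariance
R = 0.1 * np.eye(n_obs)                    # observation noise covariance

# Simulate a velocity trajectory and noisy "firing rate" observations.
v = np.zeros((T, n_state))
y = np.zeros((T, n_obs))
for t in range(1, T):
    v[t] = A @ v[t - 1] + rng.multivariate_normal(np.zeros(n_state), Q)
    y[t] = C @ v[t] + rng.multivariate_normal(np.zeros(n_obs), R)

# Standard Kalman filter recursion.
v_hat = np.zeros(n_state)
P = np.eye(n_state)
estimates = np.zeros((T, n_state))
for t in range(T):
    v_pred = A @ v_hat                         # predict from the prior dynamics
    P_pred = A @ P @ A.T + Q
    S = C @ P_pred @ C.T + R                   # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)        # Kalman gain
    v_hat = v_pred + K @ (y[t] - C @ v_pred)   # correct with the observation
    P = (np.eye(n_state) - K @ C) @ P_pred
    estimates[t] = v_hat

# Filtering beats a naive per-step least-squares readout of the rates.
naive = y @ np.linalg.pinv(C).T
assert np.mean((estimates - v) ** 2) < np.mean((naive - v) ** 2)
```

In the NEF mapping, the steady-state form of this recursion (a fixed linear update of the velocity estimate from its previous value and the current rates) is what gets compiled into the spiking network's connection weights.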