Sparse Spiking Gradient Descent

There is increasing interest in emulating Spiking Neural Networks (SNNs) on neuromorphic computing devices due to their low energy consumption. Recent advances have allowed training SNNs to the point where they begin to compete with traditional Artificial Neural Networks (ANNs) in terms of accuracy, while remaining energy efficient when run on neuromorphic hardware. However, training SNNs is still based on dense tensor operations originally developed for ANNs, which do not leverage the spatiotemporally sparse nature of SNNs. We present here the first sparse SNN backpropagation algorithm, which achieves accuracy equal to or better than current state-of-the-art methods while being significantly faster and more memory efficient. We show the effectiveness of our method on real datasets of varying complexity (Fashion-MNIST, Neuromorphic-MNIST and Spiking Heidelberg Digits), achieving a speedup in the backward pass of up to 150x while being 85% more memory efficient, without losing accuracy.
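To make the sparsity argument concrete, the following is a minimal, hypothetical PyTorch sketch of one way such a sparse backward pass can arise: with a boxcar-shaped surrogate gradient of finite support, the derivative of the spike nonlinearity is exactly zero except for neurons whose membrane potential lies in a band around the firing threshold, so only those "active" entries need to be touched during backpropagation. The class name `SparseSpikeFn` and the parameters `threshold` and `band` are illustrative choices, not the paper's API; the dense masking used here is only for clarity, whereas a genuinely sparse implementation (as the abstract describes) would operate on the active index set directly.

```python
import torch

class SparseSpikeFn(torch.autograd.Function):
    """Heaviside spike nonlinearity with a boxcar surrogate gradient.

    The surrogate derivative is 1 inside a band around the firing threshold
    and exactly 0 outside it, so only "active" entries contribute to the
    backward pass.
    """

    @staticmethod
    def forward(ctx, v, threshold, band):
        # Forward pass: a neuron emits a spike when its membrane potential
        # reaches the threshold.
        spikes = (v >= threshold).to(v.dtype)
        # Remember which entries are close enough to the threshold to carry
        # a nonzero surrogate gradient.
        active = (v - threshold).abs() <= band
        ctx.save_for_backward(active)
        return spikes

    @staticmethod
    def backward(ctx, grad_output):
        (active,) = ctx.saved_tensors
        # Gradients are zero outside the active band, so only those indices
        # need to be written (a sparse implementation would only ever touch
        # them; here we index into a dense tensor for simplicity).
        grad_v = torch.zeros_like(grad_output)
        idx = active.nonzero(as_tuple=True)
        grad_v[idx] = grad_output[idx]
        return grad_v, None, None  # no gradients for threshold and band


if __name__ == "__main__":
    v = torch.randn(4, 8, requires_grad=True)   # batch x neurons membrane potentials
    spikes = SparseSpikeFn.apply(v, 1.0, 0.5)
    spikes.sum().backward()
    print("fraction of nonzero input gradients:",
          (v.grad != 0).float().mean().item())
```

A dense surrogate-gradient implementation would multiply `grad_output` by the surrogate derivative at every element; when spiking activity is sparse, the boxcar surrogate is zero almost everywhere, and restricting the backward-pass work to the active subset is the kind of saving that can produce the speedups and memory reductions reported in the abstract.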


Results from the Paper


Task                 | Dataset       | Model                                  | Metric             | Value | Global Rank
Image Classification | Fashion-MNIST | Sparse Spiking Gradient Descent (CNN)  | Accuracy           | 86.7  | #11
Image Classification | Fashion-MNIST | Sparse Spiking Gradient Descent (MLP)  | Accuracy           | 82.7  | #12
Image Classification | N-MNIST       | Sparse Spiking Gradient Descent        | Accuracy           | 92.7  | #4
Audio Classification | SHD           | Sparse Spiking Gradient Descent        | Percentage correct | 77.5  | #9

Methods


No methods listed for this paper.