Robust and accelerated single-spike spiking neural network training with applicability to challenging temporal tasks

30 May 2022  ·  Luke Taylor, Andrew King, Nicol Harper ·

Spiking neural networks (SNNs), particularly the single-spike variant in which neurons spike at most once, are considerably more energy efficient than standard artificial neural networks (ANNs). However, single-spike SNNs are difficult to train due to their dynamic and non-differentiable nature, and current solutions are either slow or suffer from training instabilities. These networks have also been critiqued for their limited computational applicability, such as being unsuitable for time-series datasets. We propose a new model for training single-spike SNNs which mitigates the aforementioned training issues and obtains competitive results across various image and neuromorphic datasets, with up to a $13.98\times$ training speedup and up to an $81\%$ reduction in spikes compared to multi-spike SNNs. Notably, our model performs on par with multi-spike SNNs on challenging tasks involving neuromorphic time-series datasets, demonstrating a broader computational role for single-spike SNNs than previously believed.
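To illustrate the single-spike constraint described above, here is a minimal NumPy sketch of a leaky integrate-and-fire neuron that fires at most once, with information carried by *when* the first spike occurs. This is an illustrative toy, not the paper's FastSNN model; the time constant, threshold, and input values are arbitrary assumptions.

```python
import numpy as np

def single_spike_lif(inputs, tau=10.0, threshold=1.0, dt=1.0):
    """Leaky integrate-and-fire neuron that spikes at most once.

    Returns the time step of the first threshold crossing,
    or None if the membrane potential never reaches threshold.
    """
    v = 0.0
    for t, i_t in enumerate(inputs):
        # Discretized leaky integration of the input current
        v += dt * (-v / tau + i_t)
        if v >= threshold:
            return t  # emit a single spike, then stay silent
    return None

# A strong constant drive crosses threshold early; a weak one never spikes.
strong = single_spike_lif(np.full(50, 0.3))   # spikes at t = 3
weak = single_spike_lif(np.full(50, 0.05))    # steady state 0.5 < 1, no spike
```

The spike *time* (earlier for stronger inputs) is the neuron's output, which is what makes the model sparse but also non-differentiable to train.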


Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Image Classification | Fashion-MNIST | FastSNN (CNN) | Accuracy | 90.57 | #8 |
| Image Classification | Fashion-MNIST | FastSNN (MLP) | Accuracy | 89.05 | #9 |
| Image Classification | MNIST | FastSNN (CNN) | Accuracy | 99.3 | #18 |
| Image Classification | MNIST | FastSNN (MLP) | Accuracy | 97.91 | #27 |
| Image Classification | N-MNIST | FastSNN | Accuracy | 95.91 | #3 |
| Audio Classification | SHD | FastSNN | Percentage correct | 70.32 | #10 |

Methods