Encoding Event-Based Data With a Hybrid SNN Guided Variational Auto-encoder in Neuromorphic Hardware

31 Mar 2021 · Kenneth Stewart, Andreea Danielescu, Timothy Shea, Emre Neftci

Neuromorphic hardware equipped with learning capabilities can adapt to new, real-time data. While models of Spiking Neural Networks (SNNs) can now be trained with gradient descent to reach accuracy comparable to equivalent conventional neural networks, such learning often relies on external labels. However, real-world data is often unlabeled, which can make supervised methods inapplicable. To solve this problem, we propose a Hybrid Guided Variational Autoencoder (VAE) that uses an SNN to encode event-based data sensed by a Dynamic Vision Sensor (DVS) into a latent space representation. These representations can be used as an embedding to measure data similarity and to predict labels in real-world data. We show that the Hybrid Guided-VAE achieves 87% classification accuracy on the DVSGesture dataset and that it encodes the sparse, noisy inputs into an interpretable latent space representation, visualized through t-SNE plots. We also implement the encoder component of the model on neuromorphic hardware and discuss the potential for our algorithm to enable real-time learning from real-world event data.
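The pipeline the abstract describes — a spiking encoder that turns a DVS event stream into features, followed by a VAE head that maps those features to a latent code via the reparameterization trick — can be sketched in miniature. This is a minimal, untrained illustration in pure Python, not the paper's implementation: the LIF dynamics are simplified, and all weights, layer sizes, and function names (`lif_encode`, `vae_head`) are hypothetical.

```python
import math
import random

random.seed(0)

def lif_encode(events, n_in, n_hidden, T, tau=0.9, threshold=1.0):
    """Run a toy leaky integrate-and-fire (LIF) layer over a spike train.

    events: list of (t, i) pairs -- DVS-style events (time step, input channel).
    Returns per-neuron spike counts, a crude rate summary of the hidden activity.
    """
    # Fixed random weights: this sketch is untrained, unlike the paper's model.
    w = [[random.gauss(0, 0.5) for _ in range(n_in)] for _ in range(n_hidden)]
    v = [0.0] * n_hidden          # membrane potentials
    counts = [0] * n_hidden
    spikes_by_t = {}
    for t, i in events:
        spikes_by_t.setdefault(t, []).append(i)
    for t in range(T):
        active = spikes_by_t.get(t, [])
        for j in range(n_hidden):
            v[j] = tau * v[j] + sum(w[j][i] for i in active)  # leak + input current
            if v[j] >= threshold:                             # fire and reset
                counts[j] += 1
                v[j] = 0.0
    return counts

def vae_head(features, n_latent):
    """Map encoder features to a latent sample via the reparameterization trick."""
    n = len(features)
    w_mu = [[random.gauss(0, 0.1) for _ in range(n)] for _ in range(n_latent)]
    w_lv = [[random.gauss(0, 0.1) for _ in range(n)] for _ in range(n_latent)]
    mu = [sum(w_mu[k][j] * features[j] for j in range(n)) for k in range(n_latent)]
    logvar = [sum(w_lv[k][j] * features[j] for j in range(n)) for k in range(n_latent)]
    # z = mu + sigma * eps, with eps ~ N(0, 1)
    z = [m + math.exp(0.5 * lv) * random.gauss(0, 1) for m, lv in zip(mu, logvar)]
    return mu, logvar, z

# Toy event stream: 40 random events over 20 time steps on 16 input channels.
events = [(random.randrange(20), random.randrange(16)) for _ in range(40)]
counts = lif_encode(events, n_in=16, n_hidden=8, T=20)
mu, logvar, z = vae_head(counts, n_latent=4)
print(len(z))  # 4-dimensional latent code
```

In the paper the encoder is a trained SNN and the latent space is shaped by a guiding classifier; here the point is only the data flow: events in, spike-derived features, then a stochastic latent code that can serve as an embedding for similarity or label prediction.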
