Regulation toward Self-organized Criticality in a Recurrent Spiking Neural Reservoir

Generating stable yet performant spiking neural reservoirs for classification tasks remains an open problem, owing to the highly non-linear dynamics of recurrent spiking neural networks. To address this, a local and unsupervised learning rule that tunes the reservoir toward self-organized criticality is proposed and applied to networks of leaky integrate-and-fire neurons with random and small-world topologies. After learning, both topologies exhibited longer sustained activity than under spectral radius normalization (a global rescaling scheme). The method's ability to control the desired regime of the reservoir was demonstrated, and quick convergence toward that regime was observed on speech signals. The proposed regulation can be applied online and yields reservoirs more strongly adapted to the task at hand.
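The abstract does not spell out the learning rule itself, so the sketch below is only illustrative: it shows (1) the spectral radius normalization baseline the paper compares against, which is standard practice in reservoir computing, and (2) a generic local, activity-driven update that nudges an estimated branching ratio toward 1, the usual operating point associated with criticality. The reservoir size, sparsity, target values, and the `local_criticality_step` function are all assumptions for illustration, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200                                                          # reservoir size (assumed)
W = rng.normal(0.0, 1.0, (n, n)) * (rng.random((n, n)) < 0.1)    # sparse random recurrent weights

# (1) Baseline: global rescaling so the spectral radius hits a target value.
rho_target = 0.95                                                # assumed target spectral radius
rho = max(abs(np.linalg.eigvals(W)))
W_normalized = W * (rho_target / rho)

# (2) Local regulation toward criticality (generic sketch, not the paper's rule):
# each active presynaptic neuron scales its outgoing weights so that one of its
# spikes triggers, on average, about one postsynaptic spike (branching ratio ~ 1).
def local_criticality_step(W, pre_spikes, post_spikes, eta=0.01, target_branching=1.0):
    """One plasticity step; pre_spikes/post_spikes are 0/1 vectors for a time bin."""
    active = pre_spikes > 0
    if not np.any(active):
        return W
    # Estimated number of postsynaptic spikes per active presynaptic neuron in this bin.
    branching_est = post_spikes.sum() / active.sum()
    W = W.copy()
    # Scale outgoing weights of active neurons up if sub-critical, down if super-critical.
    W[:, active] *= 1.0 + eta * (target_branching - branching_est)
    return W

# Toy usage with random spike vectors standing in for recorded reservoir activity.
pre = (rng.random(n) < 0.05).astype(float)
post = (rng.random(n) < 0.05).astype(float)
W_regulated = local_criticality_step(W_normalized, pre, post)
```

Because the update only needs each neuron's own spike counts in a time bin, a rule of this general form can run online alongside the reservoir, which is the property the abstract highlights over a one-shot global rescaling.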
