Associative content-addressable networks with exponentially many robust stable states

6 Apr 2017 · Rishidev Chaudhuri, Ila Fiete

The brain must robustly store a large number of memories, corresponding to the many events encountered over a lifetime. However, the number of memory states in existing neural network models either grows weakly with network size or recall fails catastrophically with vanishingly little noise. We construct an associative content-addressable memory with exponentially many stable states and robust error-correction. The network possesses expander graph connectivity on a restricted Boltzmann machine architecture. The expansion property allows simple neural network dynamics to perform at par with modern error-correcting codes. Appropriate networks can be constructed with sparse random connections, glomerular nodes, and associative learning using low dynamic-range weights. Thus, sparse quasi-random structures---characteristic of important error-correcting codes---may provide for high-performance computation in artificial neural networks and the brain.

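The core construction described in the abstract, a bipartite (RBM-like) network whose hidden units act as sparse, expander-like constraints over the visible units, with simple local dynamics doing the error correction, can be illustrated with a toy bit-flipping decoder in the style of Sipser-Spielman. The sketch below is a minimal illustration, not the paper's actual network or learning rule: the parameters `n`, `m`, `d` and the helper `bitflip_decode` are hypothetical choices for demonstration, and the greedy flip rule stands in for the paper's neural dynamics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the paper): n visible (variable) units,
# m hidden (constraint) units, each constraint wired to d random visible units.
n, m, d = 48, 24, 6

# Sparse random bipartite connectivity; such graphs are, with high
# probability, good expanders, which is the property the paper exploits.
H = np.zeros((m, n), dtype=int)
for j in range(m):
    H[j, rng.choice(n, size=d, replace=False)] = 1

def unsatisfied(x):
    """Parity of each constraint for binary state x (1 = violated)."""
    return (H @ x) % 2

def bitflip_decode(x, max_iters=50):
    """Greedy bit-flip dynamics: repeatedly flip the visible unit that sits in
    the most violated constraints. On a good expander this simple local rule
    corrects a constant fraction of errors."""
    x = x.copy()
    for _ in range(max_iters):
        syndrome = unsatisfied(x)
        if not syndrome.any():
            break
        # For each visible unit, count how many of its constraints are violated.
        violations = H.T @ syndrome
        x[np.argmax(violations)] ^= 1
    return x

# A stored "memory" is any state satisfying all constraints; the all-zeros
# state always does. Corrupt a few bits and let the dynamics try to clean it up.
memory = np.zeros(n, dtype=int)
noisy = memory.copy()
noisy[rng.choice(n, size=3, replace=False)] ^= 1
recovered = bitflip_decode(noisy)
print("errors before:", int((noisy != memory).sum()),
      "after:", int((recovered != memory).sum()))
```

Because the constraints are sparse and quasi-random, the number of states satisfying all of them grows exponentially in n while remaining separated enough for local dynamics to correct noise, which is the regime the abstract contrasts with classical Hopfield-style capacity limits.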