Resonator Networks outperform optimization methods at solving high-dimensional vector factorization

19 Jun 2019 · Spencer J. Kent, E. Paxon Frady, Friedrich T. Sommer, Bruno A. Olshausen

We develop theoretical foundations of Resonator Networks, a new type of recurrent neural network introduced in Frady et al. (2020) to solve a high-dimensional vector factorization problem arising in Vector Symbolic Architectures. Given a composite vector formed as the Hadamard product of a discrete set of high-dimensional vectors, a Resonator Network can efficiently decompose the composite into those factors. We compare the performance of Resonator Networks against optimization-based methods, including Alternating Least Squares and several gradient-based algorithms, and show that Resonator Networks are superior in several important ways. This advantage comes from combining nonlinear dynamics with "searching in superposition," in which estimates of the correct solution are formed from a weighted superposition of all possible solutions. While the alternative methods also search in superposition, the dynamics of Resonator Networks allow them to strike a more effective balance between exploring the solution space and exploiting local information to drive the network toward probable solutions. Resonator Networks are not guaranteed to converge, but within a particular regime they almost always do. In exchange for relaxing this guarantee of global convergence, Resonator Networks are dramatically more effective at finding factorizations than all of the alternative approaches considered.
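To make the "searching in superposition" idea concrete, here is a minimal NumPy sketch of one standard formulation of Resonator Network dynamics with bipolar codevectors: each factor estimate unbinds the composite by the other factors' current estimates, projects the result onto its own codebook, and passes the weighted superposition through a sign nonlinearity. This is an illustrative sketch, not the authors' reference implementation; the dimensions, codebook sizes, update order, and convergence check are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1500   # vector dimension (assumed for this sketch)
F = 3      # number of factors
D = 30     # codevectors per factor (assumed)

# Random bipolar codebooks, one per factor.
codebooks = [rng.choice([-1.0, 1.0], size=(D, N)) for _ in range(F)]

# Composite vector: Hadamard product of one codevector drawn from each codebook.
true_idx = [int(rng.integers(D)) for _ in range(F)]
s = np.prod([codebooks[f][true_idx[f]] for f in range(F)], axis=0)

def clean_sign(v):
    """Bipolarize a vector, breaking zero ties toward +1."""
    out = np.sign(v)
    out[out == 0] = 1.0
    return out

# Initialize each factor estimate as the superposition of its entire codebook,
# i.e., every candidate solution is entertained at once.
x_hat = [clean_sign(cb.sum(axis=0)) for cb in codebooks]

for step in range(200):
    for f in range(F):
        # Unbind: multiply the composite by the other factors' current estimates
        # (bipolar vectors are their own inverse under the Hadamard product).
        others = np.prod([x_hat[g] for g in range(F) if g != f], axis=0)
        unbound = s * others
        # Clean up: project onto this factor's codebook (a weighted superposition
        # of its codevectors) and apply the sign nonlinearity.
        x_hat[f] = clean_sign(codebooks[f].T @ (codebooks[f] @ unbound))
    # Read out each estimate as its most similar codevector.
    decoded = [int(np.argmax(codebooks[f] @ x_hat[f])) for f in range(F)]
    if decoded == true_idx:
        print(f"recovered factorization {decoded} at iteration {step}")
        break
```

In this sketch, the initialization at the full-codebook superposition and the codebook projection embody the search in superposition, while the sign nonlinearity supplies the nonlinear dynamics that distinguish the approach from Alternating Least Squares and gradient-based alternatives.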
