no code implementations • 8 Nov 2023 • Christopher J. Kymn, Denis Kleyko, E. Paxon Frady, Connor Bybee, Pentti Kanerva, Friedrich T. Sommer, Bruno A. Olshausen
We introduce Residue Hyperdimensional Computing, a computing framework that unifies residue number systems with an algebra defined over random, high-dimensional vectors.
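The core idea can be sketched with random phasor vectors: each modulus gets a base vector whose phases are multiples of 2*pi/m, an integer is encoded by elementwise exponentiation of each base, and the per-modulus codes are combined by elementwise multiplication. The following is a minimal illustration under those assumptions; all names, dimensions, and the brute-force decoder are illustrative, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 1024             # vector dimensionality
moduli = [3, 5, 7]   # pairwise coprime, so 0..104 are represented uniquely (CRT)

# One random phasor base per modulus; restricting phases to multiples of
# 2*pi/m makes elementwise exponentiation periodic with period m.
bases = [np.exp(2j * np.pi * rng.integers(0, m, D) / m) for m in moduli]

def encode(n):
    # Encode each residue n mod m by exponentiation, then bind the
    # per-modulus codes together by elementwise multiplication.
    return np.prod([b ** (n % m) for b, m in zip(bases, moduli)], axis=0)

def decode(v, N=105):
    # Brute-force decoder for illustration: pick the integer whose code
    # has the highest (real) inner-product similarity with v.
    sims = [np.real(np.vdot(encode(n), v)) / D for n in range(N)]
    return int(np.argmax(sims))
```

Self-similarity of a phasor code is exactly 1, while codes of other integers have near-zero expected similarity, so `decode(encode(n))` recovers `n`.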
no code implementations • 26 May 2023 • Denis Kleyko, Connor Bybee, Ping-Chen Huang, Christopher J. Kymn, Bruno A. Olshausen, E. Paxon Frady, Friedrich T. Sommer
In particular, we find that the decoding techniques from the sparse coding and compressed sensing literature (rarely used for Hyperdimensional Computing/Vector Symbolic Architectures) are also well-suited for decoding information from the compositional distributed representations.
no code implementations • 23 Mar 2023 • E. Paxon Frady, Spencer Kent, Quinn Tran, Pentti Kanerva, Bruno A. Olshausen, Friedrich T. Sommer
In contrast to learning category labels, here we train deep neural networks to output the full compositional vector description of an input image.
no code implementations • 7 Dec 2022 • Connor Bybee, Denis Kleyko, Dmitri E. Nikonov, Amir Khosrowshahi, Bruno A. Olshausen, Friedrich T. Sommer
A prominent approach to solving combinatorial optimization problems on parallel hardware is Ising machines, i.e., hardware implementations of networks of interacting binary spin variables.
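An Ising machine seeks low-energy states of E(s) = -1/2 s^T J s over spins s in {-1, +1}^n. A toy software stand-in for such dynamics is simulated annealing with single-spin flips; the MAX-CUT instance, couplings, and schedule below are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Small MAX-CUT instance: antiferromagnetic coupling J[i, j] = -1 on each
# graph edge, so minimizing the Ising energy maximizes the cut.
n = 6
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0), (0, 3)]
J = np.zeros((n, n))
for i, j in edges:
    J[i, j] = J[j, i] = -1.0

def energy(s):
    # Ising energy E(s) = -1/2 * s^T J s for spins s in {-1, +1}^n.
    return -0.5 * s @ J @ s

def anneal(steps=2000, T0=2.0):
    # Metropolis single-spin-flip dynamics with a linear cooling schedule.
    s = rng.choice([-1.0, 1.0], size=n)
    for t in range(steps):
        T = T0 * (1 - t / steps) + 1e-3
        i = rng.integers(n)
        dE = 2.0 * s[i] * (J[i] @ s)   # energy change from flipping spin i
        if dE < 0 or rng.random() < np.exp(-dE / T):
            s[i] = -s[i]
    return s
```

The local energy change `dE` is what hardware spin networks evaluate in parallel; here it is computed one spin at a time.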
no code implementations • 5 Sep 2022 • Alpha Renner, Lazar Supic, Andreea Danielescu, Giacomo Indiveri, E. Paxon Frady, Friedrich T. Sommer, Yulia Sandamirskaya
The VO network we propose generates and stores a working memory of the presented visual environment.
no code implementations • 26 Aug 2022 • Alpha Renner, Lazar Supic, Andreea Danielescu, Giacomo Indiveri, Bruno A. Olshausen, Yulia Sandamirskaya, Friedrich T. Sommer, E. Paxon Frady
Understanding a visual scene by inferring identities and poses of its individual objects is still an open problem.
no code implementations • 5 Apr 2022 • Connor Bybee, Alexander Belsten, Friedrich T. Sommer
We show that for values of $Q$ equal to the ratio of $\gamma$ to $\theta$ oscillation frequencies observed in the hippocampus and the cortex, the associative memory achieves greater capacity and information storage than previous models.
no code implementations • 1 Apr 2022 • Connor Bybee, E. Paxon Frady, Friedrich T. Sommer
Spiking Neural Networks (SNNs) have attracted the attention of the deep learning community for use in low-latency, low-power neuromorphic hardware, as well as models for understanding neuroscience.
no code implementations • 2 Mar 2022 • Denis Kleyko, Connor Bybee, Christopher J. Kymn, Bruno A. Olshausen, Amir Khosrowshahi, Dmitri E. Nikonov, Friedrich T. Sommer, E. Paxon Frady
In this paper, we present an approach to integer factorization using distributed representations formed with Vector Symbolic Architectures.
no code implementations • 26 Jan 2022 • Christopher Warner, Kiersten Ruda, Friedrich T. Sommer
The model infers sparse activity in a set of binary latent variables, each describing the activity of a cell assembly.
1 code implementation • 24 Jan 2022 • Zengyi Li, Yubei Chen, Yann Lecun, Friedrich T. Sommer
We argue that achieving manifold clustering with neural networks requires two essential ingredients: a domain-specific constraint that ensures the identification of the manifolds, and a learning algorithm for embedding each manifold to a linear subspace in the feature space.
no code implementations • 5 Nov 2021 • Garrick Orchard, E. Paxon Frady, Daniel Ben Dayan Rubin, Sophia Sanborn, Sumit Bam Shrestha, Friedrich T. Sommer, Mike Davies
The biologically inspired spiking neurons used in neuromorphic computing are nonlinear filters with dynamic state variables -- very different from the stateless neuron models used in deep learning.
no code implementations • 8 Sep 2021 • E. Paxon Frady, Denis Kleyko, Christopher J. Kymn, Bruno A. Olshausen, Friedrich T. Sommer
By analogy to VSA, we call this new function encoding and computing framework Vector Function Architecture (VFA).
no code implementations • 9 Jun 2021 • Denis Kleyko, Mike Davies, E. Paxon Frady, Pentti Kanerva, Spencer J. Kent, Bruno A. Olshausen, Evgeny Osipov, Jan M. Rabaey, Dmitri A. Rachkovskij, Abbas Rahimi, Friedrich T. Sommer
We see them acting as a framework for computing with distributed representations that can play the role of an abstraction layer for emerging computing hardware.
no code implementations • 14 Dec 2020 • Denis Kleyko, Antonello Rosato, E. Paxon Frady, Massimo Panella, Friedrich T. Sommer
The perceptron theory compares favorably to other methods that do not rely on training an estimator model.
no code implementations • 7 Oct 2020 • Denis Kleyko, E. Paxon Frady, Friedrich T. Sommer
The variable encoding uses a fixed set of random patterns, which has to be stored and kept available during the computation.
1 code implementation • 7 Oct 2020 • Zengyi Li, Yubei Chen, Friedrich T. Sommer
However, in the continuous case, unfavorable geometry of the target distribution can greatly limit the efficiency of MCMC methods.
no code implementations • 14 Sep 2020 • E. Paxon Frady, Denis Kleyko, Friedrich T. Sommer
One binding method for general sparse vectors extends earlier proposals, such as circular convolution, that reduce the tensor product to a vector.
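Circular convolution, as in Plate's Holographic Reduced Representations, compresses the tensor product of two D-dimensional vectors back into a single D-dimensional vector, and binding can be approximately inverted by circular correlation. A minimal numpy sketch of that classical scheme (for dense random vectors, not the sparse extension the paper develops):

```python
import numpy as np

rng = np.random.default_rng(2)
D = 1024

def bind(a, b):
    # Circular convolution via FFT: collapses the outer product of a and b
    # into one D-dimensional vector.
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def unbind(c, a):
    # Circular correlation: an approximate inverse of binding with a.
    return np.real(np.fft.ifft(np.fft.fft(c) * np.conj(np.fft.fft(a))))

a = rng.normal(0, 1 / np.sqrt(D), D)
b = rng.normal(0, 1 / np.sqrt(D), D)
c = bind(a, b)
b_hat = unbind(c, a)
# b_hat is a noisy copy of b; its cosine similarity to b is well above chance
cos = b_hat @ b / (np.linalg.norm(b_hat) * np.linalg.norm(b))
```

The recovered `b_hat` is then typically cleaned up by matching against a codebook of known vectors.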
no code implementations • 7 Jul 2020 • E. Paxon Frady, Spencer Kent, Bruno A. Olshausen, Friedrich T. Sommer
The ability to encode and manipulate data structures with distributed neural representations could qualitatively enhance the capabilities of traditional neural networks by supporting rule-based symbolic reasoning, a central property of cognition.
no code implementations • 6 May 2020 • Christopher Warner, Friedrich T. Sommer
Networks with phase interactions set by standard representations of the feature graph (adjacency matrix, Graph Laplacian or modularity) failed to exhibit segmentation performance significantly over the baseline, a model of independent sensors.
no code implementations • 4 May 2020 • Zengyi Li, Friedrich T. Sommer
We extend the framework of Boltzmann machines to a network of complex-valued neurons with variable amplitudes, referred to as Complex Amplitude-Phase Boltzmann machine (CAP-BM).
no code implementations • 27 Apr 2020 • E. Paxon Frady, Garrick Orchard, David Florey, Nabil Imam, Ruokun Liu, Joyesh Mishra, Jonathan Tse, Andreas Wild, Friedrich T. Sommer, Mike Davies
Neuromorphic computing applies insights from neuroscience to uncover innovations in computing technology.
2 code implementations • 17 Oct 2019 • Zengyi Li, Yubei Chen, Friedrich T. Sommer
Recently, Song and Ermon (2019) showed that a generative model trained by denoising score matching achieves excellent sample synthesis when trained on data samples corrupted with multiple levels of noise.
no code implementations • 25 Sep 2019 • Zengyi Li, Yubei Chen, Friedrich T. Sommer
Energy-based models output unnormalized log-probability values given data samples.
no code implementations • 19 Jun 2019 • Spencer J. Kent, E. Paxon Frady, Friedrich T. Sommer, Bruno A. Olshausen
We develop theoretical foundations of Resonator Networks, a new type of recurrent neural network introduced in Frady et al. (2020) to solve a high-dimensional vector factorization problem arising in Vector Symbolic Architectures.
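The factorization problem: given a composite vector s = x * y (elementwise binding of bipolar vectors), with x and y known only to come from fixed codebooks, recover both factors. A resonator network iterates an unbind-project-binarize step for each factor in turn. A small sketch under those assumptions; codebook sizes and the update rule are simplified relative to the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
D = 1024

# Two small codebooks of random bipolar vectors; binding is the
# elementwise (Hadamard) product, which is its own inverse for +/-1 entries.
X = rng.choice([-1.0, 1.0], size=(8, D))
Y = rng.choice([-1.0, 1.0], size=(8, D))
s = X[2] * Y[5]          # composite vector to factor

# Initialize each estimate as the superposition of its codebook.
x, y = X.sum(0), Y.sum(0)
for _ in range(30):
    # Unbind the other estimate from s, project onto the codebook
    # (X.T @ (X @ v) is a weighted sum of codewords), then binarize.
    x = np.sign(X.T @ (X @ (s * y)))
    y = np.sign(Y.T @ (Y @ (s * x)))

i = int(np.argmax(X @ x))   # index of the recovered x factor
j = int(np.argmax(Y @ y))   # index of the recovered y factor
```

For a problem this small relative to D, the iteration settles on the correct codewords, here indices 2 and 5.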
no code implementations • 23 Jan 2019 • E. Paxon Frady, Friedrich T. Sommer
Here, we propose a novel type of attractor neural network in complex state space, and show how it can be leveraged to construct spiking neural networks with robust computational properties through a phase-to-timing mapping.
no code implementations • 28 Feb 2018 • E. Paxon Frady, Denis Kleyko, Friedrich T. Sommer
The theory describes linear readout of analog data, and readout with winner-take-all error correction of symbolic data as proposed in VSA models.
no code implementations • 5 Jul 2017 • E. Paxon Frady, Denis Kleyko, Friedrich T. Sommer
The superposition operations in different existing models are mapped to linear neural networks with unitary recurrent matrices, in which retrieval accuracy can be analyzed by a single equation.
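For instance, a sequence of patterns can be held in a single superposition vector by "rotating" each pattern with a power of a unitary matrix (here a permutation, which is unitary) before adding it in; readout undoes the rotation and matches against the stored patterns, with the other items contributing crosstalk noise. A hedged sketch of that idea, with all sizes illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
D = 1024   # vector dimensionality
T = 10     # sequence length

# A random permutation serves as the unitary recurrent matrix W.
perm = rng.permutation(D)

def rotate(v, t, p):
    # Apply the permutation p to v, t times (i.e., W^t v).
    out = v.copy()
    for _ in range(t):
        out = out[p]
    return out

patterns = rng.choice([-1.0, 1.0], size=(T, D))
# Memory = superposition of rotated patterns: sum_t W^t x_t.
memory = np.sum([rotate(patterns[t], t, perm) for t in range(T)], axis=0)

def recall(t):
    # Undo t rotations, then take a winner-take-all match against the
    # stored patterns; the target's similarity ~D dominates the crosstalk.
    inv = np.argsort(perm)           # inverse permutation
    v = rotate(memory, t, inv)
    return int(np.argmax(patterns @ v))
```

The signal-to-crosstalk ratio of this readout is exactly the kind of quantity the paper's single-equation accuracy analysis characterizes.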
no code implementations • 10 Jun 2016 • Jesse A. Livezey, Alejandro F. Bujan, Friedrich T. Sommer
Further, by comparing ICA algorithms on synthetic data and natural images to the computationally more expensive sparse coding solution, we show that the coherence control biases the exploration of the data manifold, sometimes yielding suboptimal solutions.