no code implementations • 8 Nov 2023 • Christopher J. Kymn, Denis Kleyko, E. Paxon Frady, Connor Bybee, Pentti Kanerva, Friedrich T. Sommer, Bruno A. Olshausen
We introduce Residue Hyperdimensional Computing, a computing framework that unifies residue number systems with an algebra defined over random, high-dimensional vectors.
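The core idea can be illustrated with a small NumPy sketch (my own illustration, not code from the paper; function names and parameters are hypothetical): an integer is encoded as an element-wise power of a random phasor vector whose phases are multiples of 2*pi/m, so the encoding is periodic with period m and the inner product acts as a test of congruence mod m.

```python
import numpy as np

def residue_base(modulus, dim, rng):
    """Random phasor vector whose phases are multiples of 2*pi/modulus,
    so integer encodings repeat with period `modulus`."""
    phases = rng.integers(0, modulus, size=dim) * (2 * np.pi / modulus)
    return np.exp(1j * phases)

def encode(base, x):
    """Encode integer x by element-wise exponentiation of the base vector."""
    return base ** x

def sim(u, v):
    """Normalized real inner product of two phasor vectors."""
    return np.real(np.vdot(u, v)) / len(u)

rng = np.random.default_rng(0)
base3 = residue_base(3, 2048, rng)

print(round(sim(encode(base3, 2), encode(base3, 5)), 3))   # 2 ≡ 5 (mod 3): similarity 1.0
print(abs(sim(encode(base3, 2), encode(base3, 4))) < 0.1)  # 2 ≢ 4 (mod 3): near zero
```

Because the phases are multiples of 2*pi/3, raising the base to the 3rd power gives the all-ones vector, which is what makes the code exactly periodic mod 3.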
no code implementations • 26 May 2023 • Denis Kleyko, Connor Bybee, Ping-Chen Huang, Christopher J. Kymn, Bruno A. Olshausen, E. Paxon Frady, Friedrich T. Sommer
In particular, we find that the decoding techniques from the sparse coding and compressed sensing literature (rarely used for Hyperdimensional Computing/Vector Symbolic Architectures) are also well-suited for decoding information from the compositional distributed representations.
no code implementations • 23 Mar 2023 • E. Paxon Frady, Spencer Kent, Quinn Tran, Pentti Kanerva, Bruno A. Olshausen, Friedrich T. Sommer
In contrast to learning category labels, here we train deep neural networks to output the full compositional vector description of an input image.
no code implementations • 5 Sep 2022 • Alpha Renner, Lazar Supic, Andreea Danielescu, Giacomo Indiveri, E. Paxon Frady, Friedrich T. Sommer, Yulia Sandamirskaya
The visual odometry (VO) network we propose generates and stores a working memory of the presented visual environment.
no code implementations • 26 Aug 2022 • Alpha Renner, Lazar Supic, Andreea Danielescu, Giacomo Indiveri, Bruno A. Olshausen, Yulia Sandamirskaya, Friedrich T. Sommer, E. Paxon Frady
Understanding a visual scene by inferring identities and poses of its individual objects is still an open problem.
no code implementations • 1 Apr 2022 • Connor Bybee, E. Paxon Frady, Friedrich T. Sommer
Spiking Neural Networks (SNNs) have attracted the attention of the deep learning community, both for use in low-latency, low-power neuromorphic hardware and as models for understanding the brain.
no code implementations • 2 Mar 2022 • Denis Kleyko, Connor Bybee, Christopher J. Kymn, Bruno A. Olshausen, Amir Khosrowshahi, Dmitri E. Nikonov, Friedrich T. Sommer, E. Paxon Frady
In this paper, we present an approach to integer factorization using distributed representations formed with Vector Symbolic Architectures.
no code implementations • 5 Nov 2021 • Garrick Orchard, E. Paxon Frady, Daniel Ben Dayan Rubin, Sophia Sanborn, Sumit Bam Shrestha, Friedrich T. Sommer, Mike Davies
The biologically inspired spiking neurons used in neuromorphic computing are nonlinear filters with dynamic state variables -- very different from the stateless neuron models used in deep learning.
no code implementations • 8 Sep 2021 • E. Paxon Frady, Denis Kleyko, Christopher J. Kymn, Bruno A. Olshausen, Friedrich T. Sommer
By analogy to VSA, we call this new function encoding and computing framework Vector Function Architecture (VFA).
no code implementations • 9 Jun 2021 • Denis Kleyko, Mike Davies, E. Paxon Frady, Pentti Kanerva, Spencer J. Kent, Bruno A. Olshausen, Evgeny Osipov, Jan M. Rabaey, Dmitri A. Rachkovskij, Abbas Rahimi, Friedrich T. Sommer
We see them acting as a framework for computing with distributed representations that can play the role of an abstraction layer for emerging computing hardware.
no code implementations • 14 Dec 2020 • Denis Kleyko, Antonello Rosato, E. Paxon Frady, Massimo Panella, Friedrich T. Sommer
The perceptron theory compares favorably to other methods that do not rely on training an estimator model.
no code implementations • 7 Oct 2020 • Denis Kleyko, E. Paxon Frady, Friedrich T. Sommer
The variable encoding uses a fixed set of random patterns, which has to be stored and kept available during the computation.
no code implementations • 14 Sep 2020 • E. Paxon Frady, Denis Kleyko, Friedrich T. Sommer
One binding method for general sparse vectors extends earlier proposals, such as circular convolution, that reduce the tensor product to a vector.
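The circular-convolution baseline mentioned here (the classical dense-vector scheme, not the paper's sparse extension) is easy to sketch in NumPy: convolution in the Fourier domain compresses the outer product of two vectors into a single vector, and division in the Fourier domain undoes it exactly. This is my own illustrative sketch, not code from the paper.

```python
import numpy as np

def bind(a, b):
    """Circular convolution: compresses the outer product a⊗b into one vector."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def unbind(c, a):
    """Exact unbinding by deconvolution in the Fourier domain."""
    return np.real(np.fft.ifft(np.fft.fft(c) / np.fft.fft(a)))

rng = np.random.default_rng(1)
d = 1024
a, b = rng.normal(0, 1 / np.sqrt(d), (2, d))

c = bind(a, b)
b_hat = unbind(c, a)
cos = np.dot(b_hat, b) / (np.linalg.norm(b_hat) * np.linalg.norm(b))
print(cos > 0.999)  # True: exact deconvolution recovers b up to numerical error
```

In practice HRR-style models often use the cheaper approximate inverse (an index permutation of a) instead of exact deconvolution, trading accuracy for robustness to noise.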
no code implementations • 7 Jul 2020 • E. Paxon Frady, Spencer Kent, Bruno A. Olshausen, Friedrich T. Sommer
The ability to encode and manipulate data structures with distributed neural representations could qualitatively enhance the capabilities of traditional neural networks by supporting rule-based symbolic reasoning, a central property of cognition.
no code implementations • 27 Apr 2020 • E. Paxon Frady, Garrick Orchard, David Florey, Nabil Imam, Ruokun Liu, Joyesh Mishra, Jonathan Tse, Andreas Wild, Friedrich T. Sommer, Mike Davies
Neuromorphic computing applies insights from neuroscience to uncover innovations in computing technology.
3 code implementations • 19 Sep 2019 • Denis Kleyko, Mansour Kheffache, E. Paxon Frady, Urban Wiklund, Evgeny Osipov
The deployment of machine learning algorithms on resource-constrained edge devices is an important challenge from both theoretical and applied points of view.
no code implementations • 19 Jun 2019 • Spencer J. Kent, E. Paxon Frady, Friedrich T. Sommer, Bruno A. Olshausen
We develop theoretical foundations of Resonator Networks, a new type of recurrent neural network introduced in Frady et al. (2020) to solve a high-dimensional vector factorization problem arising in Vector Symbolic Architectures.
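The factorization problem and the resonator dynamics can be sketched in a few lines of NumPy (a minimal illustration with sizes and names of my own choosing, not the paper's code): a composite vector is the element-wise product of one codeword per factor, and each factor estimate is iteratively refined by unbinding the other estimates and projecting onto its codebook.

```python
import numpy as np

def sgn(v):
    return np.where(v >= 0, 1, -1)

rng = np.random.default_rng(2)
d, M = 1024, 5                            # vector dimension, codewords per factor
books = sgn(rng.normal(size=(3, M, d)))   # three random bipolar codebooks

true_idx = (1, 3, 0)
s = books[0, 1] * books[1, 3] * books[2, 0]   # composite: element-wise product

# Initialize each estimate to the superposition of its whole codebook.
est = [sgn(b.sum(axis=0)) for b in books]

for _ in range(50):
    for i in range(3):
        others = s.copy()
        for j in range(3):
            if j != i:
                others *= est[j]          # unbind current guesses (bipolar: self-inverse)
        # Project onto the codebook span, then threshold back to bipolar.
        est[i] = sgn(books[i].T @ (books[i] @ others))

found = tuple(int(np.argmax(books[i] @ est[i])) for i in range(3))
print(found)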
no code implementations • 23 Jan 2019 • E. Paxon Frady, Friedrich T. Sommer
Here, we propose a novel type of attractor neural network in complex state space, and show how it can be leveraged to construct spiking neural networks with robust computational properties through a phase-to-timing mapping.
no code implementations • 28 Feb 2018 • E. Paxon Frady, Denis Kleyko, Friedrich T. Sommer
The theory describes linear readout of analog data, and readout with winner-take-all error correction of symbolic data as proposed in VSA models.
no code implementations • 5 Jul 2017 • E. Paxon Frady, Denis Kleyko, Friedrich T. Sommer
The superposition operations in different existing models are mapped to linear neural networks with unitary recurrent matrices, in which retrieval accuracy can be analyzed by a single equation.
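A cyclic shift is one convenient unitary recurrent matrix, which makes the superposition scheme easy to demonstrate (an illustrative NumPy sketch under my own choice of sizes, not the paper's code): items are tagged by their position via repeated shifts, summed into one trace, and read out by undoing the shifts and cleaning up against the item codebook.

```python
import numpy as np

rng = np.random.default_rng(3)
d, M = 2048, 10
items = np.where(rng.normal(size=(M, d)) >= 0, 1, -1)  # bipolar item codebook
seq = [4, 7, 1, 7, 2]                                  # item indices to memorize

# Superpose the sequence, tagging position k by k cyclic shifts
# (a cyclic shift is a permutation, hence a unitary recurrent matrix).
memory = np.zeros(d)
for k, idx in enumerate(seq):
    memory += np.roll(items[idx], k)

# Recall position k: undo the k shifts, then clean up against the codebook.
recalled = [int(np.argmax(items @ np.roll(memory, -k))) for k in range(len(seq))]
print(recalled == seq)  # True for large d: crosstalk noise is O(sqrt(len(seq)/d))
```

The single-equation accuracy analysis in the paper amounts to comparing the signal term (the aligned item) against exactly this kind of Gaussian crosstalk from the other superposed items.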
no code implementations • 1 Jun 2017 • Denis Kleyko, E. Paxon Frady, Mansour Kheffache, Evgeny Osipov
We propose an approximation of Echo State Networks (ESNs) that, based on the mathematics of hyperdimensional computing, can be efficiently implemented on digital hardware.
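As I understand the integer-ESN idea, the dense recurrent weight matrix is replaced by a cyclic shift, the input is projected with a bipolar code, and clipping serves as the nonlinearity, so the reservoir state stays a small-integer vector. The following is a rough sketch under those assumptions (names, sizes, and the clipping threshold are my own, not from the paper):

```python
import numpy as np

def int_esn_step(state, u_enc, kappa=3):
    """One reservoir update: cyclic shift as the recurrent connection,
    integer addition of the encoded input, clipping as the nonlinearity."""
    return np.clip(np.roll(state, 1) + u_enc, -kappa, kappa)

rng = np.random.default_rng(4)
d = 512
state = np.zeros(d, dtype=int)
for _ in range(100):
    u_enc = rng.choice([-1, 1], size=d)  # bipolar encoding of one input sample
    state = int_esn_step(state, u_enc)

print(state.dtype.kind, int(state.min()) >= -3, int(state.max()) <= 3)
```

The appeal for edge hardware is that every operation here is an integer add, shift, or compare, with no floating-point multiply-accumulate in the recurrent loop.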
no code implementations • 23 Jan 2015 • Ashish Kapoor, E. Paxon Frady, Stefanie Jegelka, William B. Kristan, Eric Horvitz
We introduce and study methods for inferring and learning from correspondences among neurons.