Search Results for author: E. Paxon Frady

Found 22 papers, 1 paper with code

Computing with Residue Numbers in High-Dimensional Representation

no code implementations • 8 Nov 2023 Christopher J. Kymn, Denis Kleyko, E. Paxon Frady, Connor Bybee, Pentti Kanerva, Friedrich T. Sommer, Bruno A. Olshausen

We introduce Residue Hyperdimensional Computing, a computing framework that unifies residue number systems with an algebra defined over random, high-dimensional vectors.

Combinatorial Optimization
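
The core idea is compact enough to illustrate in NumPy: each modulus gets a random phasor base vector whose phases are multiples of 2π/m, integers are encoded by elementwise exponentiation, and the per-modulus codes are bound with the Hadamard product. A minimal sketch (dimensionality and moduli are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
D = 1024                       # vector dimensionality (illustrative)
moduli = [3, 5, 7]             # pairwise-coprime moduli

# One random phasor base per modulus; phases are multiples of 2*pi/m, so
# raising the base to the x-th power encodes x modulo m.
bases = [np.exp(2j * np.pi * rng.integers(0, m, D) / m) for m in moduli]

def encode(x):
    # Bind the per-modulus codes with the Hadamard (elementwise) product.
    return np.prod([b ** x for b in bases], axis=0)

# Encodings match exactly only when x is congruent modulo 3 * 5 * 7 = 105.
v = encode(17)
sims = [np.real(np.vdot(encode(y), v)) / D for y in range(105)]
print(np.argmax(sims), max(sims))   # 17, ~1.0; all other values near 0
```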

Efficient Decoding of Compositional Structure in Holistic Representations

no code implementations • 26 May 2023 Denis Kleyko, Connor Bybee, Ping-Chen Huang, Christopher J. Kymn, Bruno A. Olshausen, E. Paxon Frady, Friedrich T. Sommer

In particular, we find that the decoding techniques from the sparse coding and compressed sensing literature (rarely used for Hyperdimensional Computing/Vector Symbolic Architectures) are also well-suited for decoding information from the compositional distributed representations.

Retrieval
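
A sketch of the kind of compressed-sensing decoder the paper refers to, here a hand-rolled orthogonal matching pursuit recovering which codebook vectors were superposed (codebook and sizes are illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)
D, M, K = 1024, 50, 3                  # dimension, codebook size, items superposed
Phi = rng.choice([-1.0, 1.0], (D, M))  # random bipolar codebook, one code per column

true = rng.choice(M, K, replace=False)
s = Phi[:, true].sum(axis=1)           # compositional (superposition) vector

# Orthogonal matching pursuit: greedily select the best-matching code,
# refit coefficients by least squares, and subtract the explained part.
residual, support = s.copy(), []
for _ in range(K):
    support.append(int(np.argmax(np.abs(Phi.T @ residual))))
    coeffs, *_ = np.linalg.lstsq(Phi[:, support], s, rcond=None)
    residual = s - Phi[:, support] @ coeffs

print(sorted(support), sorted(true.tolist()))   # decoded indices match the true ones
```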

Learning and generalization of compositional representations of visual scenes

no code implementations • 23 Mar 2023 E. Paxon Frady, Spencer Kent, Quinn Tran, Pentti Kanerva, Bruno A. Olshausen, Friedrich T. Sommer

In contrast to learning category labels, here we train deep neural networks to output the full compositional vector description of an input image.

Object
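
A hedged sketch of what such a compositional target vector might look like: object codes bound to location codes with the Hadamard product, then combined by superposition (the codes and scene layout here are invented for illustration, not the paper's exact scheme):

```python
import numpy as np

rng = np.random.default_rng(2)
D = 2048
objects = {name: rng.choice([-1, 1], D) for name in ["digit3", "digit7"]}
x_pos = {i: rng.choice([-1, 1], D) for i in range(4)}   # coarse grid locations
y_pos = {i: rng.choice([-1, 1], D) for i in range(4)}

def describe(scene):
    # Bind each object's code to its location codes (Hadamard product),
    # then superpose all objects into one scene description vector.
    return sum(objects[o] * x_pos[x] * y_pos[y] for o, x, y in scene)

target = describe([("digit3", 0, 2), ("digit7", 3, 1)])

# A network regressing `target` from the image can be queried afterwards,
# e.g. "is digit3 at (0, 2)?", with a dot product against a probe vector:
probe = objects["digit3"] * x_pos[0] * y_pos[2]
print(probe @ target / D)   # ~1 if the bound triple is present, ~0 otherwise
```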

Deep Learning in Spiking Phasor Neural Networks

no code implementations • 1 Apr 2022 Connor Bybee, E. Paxon Frady, Friedrich T. Sommer

Spiking Neural Networks (SNNs) have attracted the attention of the deep learning community for use in low-latency, low-power neuromorphic hardware, as well as for modeling neural computation in neuroscience.

Integer Factorization with Compositional Distributed Representations

no code implementations • 2 Mar 2022 Denis Kleyko, Connor Bybee, Christopher J. Kymn, Bruno A. Olshausen, Amir Khosrowshahi, Dmitri E. Nikonov, Friedrich T. Sommer, E. Paxon Frady

In this paper, we present an approach to integer factorization using distributed representations formed with Vector Symbolic Architectures.
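
One way to set this up, sketched under the assumption of fractional power encoding: a fixed random phasor vector z encodes n as z^ln(n), so the encodings of two factors bind (Hadamard product) to the encoding of their product. The brute-force pair search below stands in for the resonator network used in the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
D = 4096
z = np.exp(1j * rng.uniform(-np.pi, np.pi, D))  # fixed random phasor vector
enc = lambda n: z ** np.log(n)                  # enc(p) * enc(q) == enc(p * q)

N = 437                                         # = 19 * 23
target = enc(N)
primes = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31]

# Score candidate factor pairs by similarity to the target encoding; in the
# paper, a resonator network replaces this brute-force search.
sim = lambda p, q: np.real(np.vdot(enc(p) * enc(q), target)) / D
best = max(((p, q) for p in primes for q in primes if p <= q),
           key=lambda pq: sim(*pq))
print(best, sim(*best))   # (19, 23), similarity ~1.0
```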

Efficient Neuromorphic Signal Processing with Loihi 2

no code implementations • 5 Nov 2021 Garrick Orchard, E. Paxon Frady, Daniel Ben Dayan Rubin, Sophia Sanborn, Sumit Bam Shrestha, Friedrich T. Sommer, Mike Davies

The biologically inspired spiking neurons used in neuromorphic computing are nonlinear filters with dynamic state variables -- very different from the stateless neuron models used in deep learning.

Audio Classification, Optical Flow Estimation
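
The contrast can be made concrete with a minimal sketch: a stateless ReLU against a leaky integrate-and-fire unit whose membrane potential is a dynamic state variable (parameters are illustrative, and this is a generic LIF model, not Loihi 2's neuron model):

```python
import numpy as np

def relu(x):
    # Stateless deep-learning unit: the output depends only on the current input.
    return np.maximum(0.0, x)

class LIFNeuron:
    """Leaky integrate-and-fire: a nonlinear filter with a dynamic state variable."""
    def __init__(self, tau=20.0, threshold=1.0, dt=1.0):
        self.decay = np.exp(-dt / tau)    # per-step leak of the membrane potential
        self.threshold = threshold
        self.v = 0.0                      # membrane potential: the dynamic state

    def step(self, x):
        self.v = self.decay * self.v + x  # leaky integration of the input current
        if self.v >= self.threshold:      # threshold nonlinearity: spike and reset
            self.v = 0.0
            return 1
        return 0

neuron = LIFNeuron()
spikes = [neuron.step(0.08) for _ in range(100)]
print(sum(spikes))   # a constant input yields a regular spike train over time
```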

Computing on Functions Using Randomized Vector Representations

no code implementations • 8 Sep 2021 E. Paxon Frady, Denis Kleyko, Christopher J. Kymn, Bruno A. Olshausen, Friedrich T. Sommer

By analogy to VSA, we call this new function encoding and computing framework Vector Function Architecture (VFA).

Density Estimation
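
A minimal sketch of the fractional-power-encoding idea underlying a VFA: inner products between encodings of real numbers approximate a shift-invariant kernel, so a function can be represented as a weighted superposition of sample encodings and evaluated with a dot product (sample points and weights are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
D = 10_000
z = np.exp(1j * rng.uniform(-np.pi, np.pi, D))  # random phasor base vector
enc = lambda x: z ** x                          # fractional power encoding of x

# Inner products between encodings approximate a shift-invariant kernel:
k = lambda x, y: np.real(np.vdot(enc(x), enc(y))) / D

# Represent a function as a weighted superposition of sample-point encodings ...
xs, ws = [-1.0, 0.0, 2.0], [0.5, 1.0, -0.7]
f_vec = sum(w * enc(x) for w, x in zip(ws, xs))

# ... and "evaluate" it at a query point with a single dot product:
print(np.real(np.vdot(enc(0.0), f_vec)) / D)  # ~1.0, the weight placed at x = 0
```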

Cellular Automata Can Reduce Memory Requirements of Collective-State Computing

no code implementations • 7 Oct 2020 Denis Kleyko, E. Paxon Frady, Friedrich T. Sommer

The variable encoding uses a fixed set of random patterns, which has to be stored and kept available during the computation.
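
A sketch of the alternative the paper studies: regenerate the random patterns on demand from a single stored seed using an elementary cellular automaton (rule 90 here; the rule choice and sizes are illustrative):

```python
import numpy as np

def rule90_step(state):
    # Elementary cellular automaton rule 90: each cell becomes the XOR
    # of its left and right neighbors (periodic boundary).
    return np.roll(state, 1) ^ np.roll(state, -1)

def expand(seed, n_patterns):
    # Regenerate pseudo-random bipolar patterns on demand from one stored
    # seed, instead of keeping the whole random codebook in memory.
    patterns, state = [], seed.copy()
    for _ in range(n_patterns):
        state = rule90_step(state)
        patterns.append(1 - 2 * state.astype(int))   # {0,1} -> {+1,-1}
    return np.array(patterns)

rng = np.random.default_rng(6)
seed = rng.integers(0, 2, 1024, dtype=np.uint8)      # the only stored randomness
codebook = expand(seed, 8)
print(codebook.shape, codebook[0] @ codebook[1] / 1024)   # roughly orthogonal
```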

Variable Binding for Sparse Distributed Representations: Theory and Applications

no code implementations • 14 Sep 2020 E. Paxon Frady, Denis Kleyko, Friedrich T. Sommer

One binding method for general sparse vectors extends earlier proposals, such as circular convolution, that reduce the tensor product to a vector.
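
A minimal sketch of circular-convolution binding for dense vectors, the classical baseline that the sparse-vector method generalizes (not the paper's sparse binding itself):

```python
import numpy as np

rng = np.random.default_rng(7)
D = 1024
a = rng.standard_normal(D) / np.sqrt(D)
b = rng.standard_normal(D) / np.sqrt(D)

def bind(x, y):
    # Circular convolution, computed in O(D log D) via the FFT.
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(y)))

def unbind(c, y):
    # Circular correlation with y approximately inverts the binding.
    return np.real(np.fft.ifft(np.fft.fft(c) * np.conj(np.fft.fft(y))))

c = bind(a, b)
a_hat = unbind(c, b)
cos = a @ a_hat / (np.linalg.norm(a) * np.linalg.norm(a_hat))
print(cos)   # well above chance (~0.7): enough for cleanup against a codebook
```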

Resonator networks for factoring distributed representations of data structures

no code implementations • 7 Jul 2020 E. Paxon Frady, Spencer Kent, Bruno A. Olshausen, Friedrich T. Sommer

The ability to encode and manipulate data structures with distributed neural representations could qualitatively enhance the capabilities of traditional neural networks by supporting rule-based symbolic reasoning, a central property of cognition.
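
The published resonator dynamics are compact enough to sketch directly: each factor estimate unbinds the other estimates from the composite vector, then cleans up the result against its own codebook (sizes here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(8)
D, M = 1500, 30                             # dimension, per-factor codebook size
X, Y, Z = (rng.choice([-1, 1], (D, M)) for _ in range(3))

ia, ib, ic = rng.choice(M, 3)
s = X[:, ia] * Y[:, ib] * Z[:, ic]          # composite: Hadamard product of factors

sgn = lambda v: np.where(v >= 0, 1, -1)
# Initialize each estimate with the superposition of its whole codebook.
xh, yh, zh = sgn(X.sum(1)), sgn(Y.sum(1)), sgn(Z.sum(1))

for _ in range(50):
    # Unbind the other two estimates from s, then clean up the result by
    # projecting onto the factor's own codebook.
    xh = sgn(X @ (X.T @ (s * yh * zh)))
    yh = sgn(Y @ (Y.T @ (s * xh * zh)))
    zh = sgn(Z @ (Z.T @ (s * xh * yh)))

found = (np.argmax(X.T @ xh), np.argmax(Y.T @ yh), np.argmax(Z.T @ zh))
print(found, (ia, ib, ic))                  # the estimates lock onto the factors
```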

Density Encoding Enables Resource-Efficient Randomly Connected Neural Networks

3 code implementations • 19 Sep 2019 Denis Kleyko, Mansour Kheffache, E. Paxon Frady, Urban Wiklund, Evgeny Osipov

The deployment of machine learning algorithms on resource-constrained edge devices is an important challenge from both theoretical and applied points of view.

BIG-bench Machine Learning

Resonator Networks outperform optimization methods at solving high-dimensional vector factorization

no code implementations • 19 Jun 2019 Spencer J. Kent, E. Paxon Frady, Friedrich T. Sommer, Bruno A. Olshausen

We develop theoretical foundations of Resonator Networks, a new type of recurrent neural network introduced in Frady et al. (2020) to solve a high-dimensional vector factorization problem arising in Vector Symbolic Architectures.

Robust computation with rhythmic spike patterns

no code implementations • 23 Jan 2019 E. Paxon Frady, Friedrich T. Sommer

Here, we propose a novel type of attractor neural network in complex state space, and show how it can be leveraged to construct spiking neural networks with robust computational properties through a phase-to-timing mapping.
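
A hedged sketch of the attractor idea: a Hopfield-style network over unit-magnitude complex states retrieves a stored phase pattern, whose phases can then be read out as spike times within an oscillation cycle (this is a generic phasor network with invented parameters, not the paper's exact construction):

```python
import numpy as np

rng = np.random.default_rng(9)
N, P = 256, 5
# Store P random phasor patterns in a Hermitian outer-product weight matrix.
patterns = np.exp(1j * rng.uniform(-np.pi, np.pi, (P, N)))
W = patterns.T @ patterns.conj() / N

z = patterns[0] * np.exp(1j * 0.3 * rng.standard_normal(N))  # noisy cue
for _ in range(20):
    z = np.exp(1j * np.angle(W @ z))   # phasor update: keep only the phases

print(abs(np.vdot(patterns[0], z)) / N)   # ~1: the phase pattern is retrieved

# Phase-to-timing mapping: each unit's phase becomes a spike time within
# one 10 ms cycle of a background oscillation.
T = 10.0
spike_times = (np.angle(z) % (2 * np.pi)) / (2 * np.pi) * T
print(spike_times[:5])
```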

A theory of sequence indexing and working memory in recurrent neural networks

no code implementations • 28 Feb 2018 E. Paxon Frady, Denis Kleyko, Friedrich T. Sommer

The theory describes linear readout of analog data, as well as readout of symbolic data with winner-take-all error correction, as proposed in VSA models.

Retrieval
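
A minimal sketch of the trajectory-association scheme the theory covers: a fixed permutation plays the role of the unitary recurrent matrix, items are superposed into a single memory vector, and symbols are recalled with winner-take-all cleanup (sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(10)
D, M, T = 2048, 26, 10
codebook = rng.choice([-1, 1], (M, D))   # one random code per symbol
perm = rng.permutation(D)                # the unitary recurrent operation
inv = np.argsort(perm)

seq = rng.choice(M, T)
m = np.zeros(D)
for s in seq:                            # encode: m <- permute(m) + code(s)
    m = m[perm] + codebook[s]

def recall(m, k):
    # Undo the permutation k times (k = steps elapsed since storage), then
    # clean up with winner-take-all against the codebook.
    v = m
    for _ in range(k):
        v = v[inv]
    return int(np.argmax(codebook @ v))

print([recall(m, T - 1 - i) for i in range(T)])   # recovers seq
print(list(seq))
```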

Theory of the superposition principle for randomized connectionist representations in neural networks

no code implementations • 5 Jul 2017 E. Paxon Frady, Denis Kleyko, Friedrich T. Sommer

The superposition operations in different existing models are mapped to linear neural networks with unitary recurrent matrices, in which retrieval accuracy can be analyzed by a single equation.

Retrieval

Integer Echo State Networks: Efficient Reservoir Computing for Digital Hardware

no code implementations • 1 Jun 2017 Denis Kleyko, E. Paxon Frady, Mansour Kheffache, Evgeny Osipov

We propose an approximation of Echo State Networks (ESN) that can be efficiently implemented on digital hardware based on the mathematics of hyperdimensional computing.

Computational Efficiency, Time Series +1
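
A sketch of the integer approximation, assuming the published intESN recipe: a cyclic shift as the recurrent matrix, bipolar input codes, and integer clipping as the nonlinearity (the signal, quantization, and sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(11)
D, kappa, levels = 1000, 3, 16            # reservoir size, clip limit, input levels
codes = rng.choice([-1, 1], (levels, D))  # one bipolar code per quantization level

def enc(u):
    # Quantize a scalar in [-1, 1] to a level and return its bipolar code.
    i = int(np.clip((u + 1) / 2 * (levels - 1), 0, levels - 1))
    return codes[i]

x = np.zeros(D, dtype=int)
for t in range(200):
    u = np.sin(2 * np.pi * t / 25)        # toy input signal
    # intESN update: a cyclic shift replaces the recurrent weight matrix and
    # integer clipping replaces the sigmoid -- integer operations only.
    x = np.clip(np.roll(x, 1) + enc(u), -kappa, kappa)

print(x[:10])   # integer reservoir state; a linear readout is trained on x
```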

Inferring and Learning from Neuronal Correspondences

no code implementations • 23 Jan 2015 Ashish Kapoor, E. Paxon Frady, Stefanie Jegelka, William B. Kristan, Eric Horvitz

We introduce and study methods for inferring and learning from correspondences among neurons.

Decision Making
