Search Results for author: Friedrich T. Sommer

Found 29 papers, 3 papers with code

Computing with Residue Numbers in High-Dimensional Representation

no code implementations • 8 Nov 2023 • Christopher J. Kymn, Denis Kleyko, E. Paxon Frady, Connor Bybee, Pentti Kanerva, Friedrich T. Sommer, Bruno A. Olshausen

We introduce Residue Hyperdimensional Computing, a computing framework that unifies residue number systems with an algebra defined over random, high-dimensional vectors.

Combinatorial Optimization
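The residue number system that the framework builds on can be sketched in a few lines. This is only the classical integer-level algebra, not the paper's high-dimensional vector implementation; the moduli below are illustrative.

```python
from math import prod

# Classical residue number system (RNS): an integer is represented by its
# residues modulo pairwise-coprime moduli.  Addition and multiplication act
# component-wise; the Chinese Remainder Theorem (CRT) recovers the integer.

MODULI = (3, 5, 7)            # pairwise coprime; dynamic range = 3*5*7 = 105

def encode(n):
    return tuple(n % m for m in MODULI)

def add(a, b):
    return tuple((x + y) % m for x, y, m in zip(a, b, MODULI))

def mul(a, b):
    return tuple((x * y) % m for x, y, m in zip(a, b, MODULI))

def decode(residues):
    """CRT reconstruction of the integer in [0, prod(MODULI))."""
    M = prod(MODULI)
    total = 0
    for r, m in zip(residues, MODULI):
        Mi = M // m
        total += r * Mi * pow(Mi, -1, m)   # modular inverse (Python 3.8+)
    return total % M

# Component-wise arithmetic matches ordinary arithmetic mod 105:
assert decode(add(encode(17), encode(23))) == 40
assert decode(mul(encode(6), encode(9))) == 54
```

The carry-free, component-wise arithmetic is what makes RNS attractive for parallel hardware, and it is the property the paper transfers to random high-dimensional vectors.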

Efficient Decoding of Compositional Structure in Holistic Representations

no code implementations • 26 May 2023 • Denis Kleyko, Connor Bybee, Ping-Chen Huang, Christopher J. Kymn, Bruno A. Olshausen, E. Paxon Frady, Friedrich T. Sommer

In particular, we find that the decoding techniques from the sparse coding and compressed sensing literature (rarely used for Hyperdimensional Computing/Vector Symbolic Architectures) are also well-suited for decoding information from the compositional distributed representations.

Retrieval
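The baseline that the sparse-coding and compressed-sensing decoders are compared against is simple matched-filter readout of a superposition. A minimal sketch, with illustrative dimensions and a toy codebook:

```python
import random

# Matched-filter decoding of a compositional superposition: several random
# bipolar codevectors are added, and items are recovered by ranking dot
# products with the codebook.  (The paper studies stronger decoders from
# sparse coding / compressed sensing; this shows only the simple baseline.)

random.seed(0)
D, K = 1000, 20                       # vector dimension, codebook size
codebook = [[random.choice((-1, 1)) for _ in range(D)] for _ in range(K)]

chosen = {2, 7, 13}                   # items to superpose
s = [sum(codebook[k][i] for k in chosen) for i in range(D)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

scores = [dot(s, c) for c in codebook]
top3 = set(sorted(range(K), key=lambda k: -scores[k])[:3])
assert top3 == chosen                 # superposed items are recovered
```

Each stored item contributes a dot product near D while cross-talk terms scale only like the square root of D, which is why ranking works at this loading.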

Learning and generalization of compositional representations of visual scenes

no code implementations • 23 Mar 2023 • E. Paxon Frady, Spencer Kent, Quinn Tran, Pentti Kanerva, Bruno A. Olshausen, Friedrich T. Sommer

In contrast to learning category labels, here we train deep neural networks to output the full compositional vector description of an input image.

Object

Efficient Optimization with Higher-Order Ising Machines

no code implementations • 7 Dec 2022 • Connor Bybee, Denis Kleyko, Dmitri E. Nikonov, Amir Khosrowshahi, Bruno A. Olshausen, Friedrich T. Sommer

A prominent approach to solving combinatorial optimization problems on parallel hardware is Ising machines, i.e., hardware implementations of networks of interacting binary spin variables.

Combinatorial Optimization
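The Ising-machine setting, including the higher-order interactions the paper advocates, can be sketched on a toy problem. The couplings below are made up for illustration, and greedy spin-flip descent stands in for the hardware dynamics:

```python
import itertools

# An Ising machine searches over spin states s_i in {-1, +1} for a minimum
# of an energy function.  A *higher-order* machine adds interaction terms
# beyond pairwise, e.g. a cubic J_ijk * s_i * s_j * s_k.

pairs = {(0, 1): 1.0, (1, 2): -0.5, (2, 3): 1.0}   # pairwise couplings J_ij
triples = {(0, 1, 2): 0.8}                         # a third-order coupling
N = 4

def energy(s):
    e = -sum(J * s[i] * s[j] for (i, j), J in pairs.items())
    e -= sum(J * s[i] * s[j] * s[k] for (i, j, k), J in triples.items())
    return e

def descend(s):
    """Greedy single-spin-flip descent to a local energy minimum."""
    s, e = list(s), energy(s)
    improved = True
    while improved:
        improved = False
        for i in range(N):
            s[i] = -s[i]                      # trial flip
            if energy(s) < e:
                e, improved = energy(s), True
            else:
                s[i] = -s[i]                  # revert
    return e

states = list(itertools.product((-1, 1), repeat=N))
best = min(descend(s) for s in states)        # descent with exhaustive restarts
assert best == min(energy(s) for s in states) # matches the global optimum
```

Note that single-flip descent alone can get stuck in local minima (the all-ones state is one here), which is why real Ising machines use noise, annealing schedules, or restarts.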

Cross-Frequency Coupling Increases Memory Capacity in Oscillatory Neural Networks

no code implementations • 5 Apr 2022 • Connor Bybee, Alexander Belsten, Friedrich T. Sommer

We show that for values of $Q$ which are the same as the ratio of $\gamma$ to $\theta$ oscillations observed in the hippocampus and the cortex, the associative memory achieves greater capacity and information storage than previous models.

Hippocampus • Retrieval

Deep Learning in Spiking Phasor Neural Networks

no code implementations • 1 Apr 2022 • Connor Bybee, E. Paxon Frady, Friedrich T. Sommer

Spiking Neural Networks (SNNs) have attracted the attention of the deep learning community for use in low-latency, low-power neuromorphic hardware, as well as in models for understanding neuroscience.

Integer Factorization with Compositional Distributed Representations

no code implementations • 2 Mar 2022 • Denis Kleyko, Connor Bybee, Christopher J. Kymn, Bruno A. Olshausen, Amir Khosrowshahi, Dmitri E. Nikonov, Friedrich T. Sommer, E. Paxon Frady

In this paper, we present an approach to integer factorization using distributed representations formed with Vector Symbolic Architectures.

A probabilistic latent variable model for detecting structure in binary data

no code implementations • 26 Jan 2022 • Christopher Warner, Kiersten Ruda, Friedrich T. Sommer

The model infers sparse activity in a set of binary latent variables, each describing the activity of a cell assembly.

Neural Manifold Clustering and Embedding

1 code implementation • 24 Jan 2022 • Zengyi Li, Yubei Chen, Yann LeCun, Friedrich T. Sommer

We argue that achieving manifold clustering with neural networks requires two essential ingredients: a domain-specific constraint that ensures the identification of the manifolds, and a learning algorithm for embedding each manifold to a linear subspace in the feature space.

Clustering • Data Augmentation • +2

Efficient Neuromorphic Signal Processing with Loihi 2

no code implementations • 5 Nov 2021 • Garrick Orchard, E. Paxon Frady, Daniel Ben Dayan Rubin, Sophia Sanborn, Sumit Bam Shrestha, Friedrich T. Sommer, Mike Davies

The biologically inspired spiking neurons used in neuromorphic computing are nonlinear filters with dynamic state variables -- very different from the stateless neuron models used in deep learning.

Audio Classification • Optical Flow Estimation
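The contrast drawn in the abstract, a spiking neuron as a nonlinear filter with dynamic state rather than a stateless unit, can be illustrated with a leaky integrate-and-fire (LIF) neuron. The parameters below are illustrative, not Loihi 2's actual neuron model:

```python
# A leaky integrate-and-fire (LIF) neuron as a stateful nonlinear filter:
# the membrane potential v integrates input with a leak and emits a spike
# (then resets) when it crosses a threshold.

def lif(inputs, leak=0.9, threshold=1.0):
    v, spikes = 0.0, []
    for x in inputs:
        v = leak * v + x            # leaky integration (dynamic state)
        if v >= threshold:          # nonlinearity: threshold and reset
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes

# The same input value can or cannot elicit a spike depending on history:
assert lif([0.5, 0.5, 0.5, 0.0, 0.0]) == [0, 0, 1, 0, 0]
```

The third identical input spikes while the first two do not, purely because of the accumulated membrane state, which is exactly the history dependence a stateless deep-learning neuron lacks.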

Computing on Functions Using Randomized Vector Representations

no code implementations • 8 Sep 2021 • E. Paxon Frady, Denis Kleyko, Christopher J. Kymn, Bruno A. Olshausen, Friedrich T. Sommer

By analogy to VSA, we call this new function encoding and computing framework Vector Function Architecture (VFA).

Density Estimation
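The core VFA encoding, fractional power encoding, can be sketched directly: a scalar x is mapped to the element-wise power z^x of a fixed random phasor vector z. The dimension and phase distribution below are illustrative assumptions:

```python
import cmath, random

# Fractional power encoding: x -> exp(i * phi * x) element-wise, for a fixed
# vector of random phases phi.  The normalized inner product of two encodings
# depends (approximately) only on x - y, i.e. it acts like a shift-invariant
# similarity kernel over the encoded scalars.

random.seed(0)
D = 5000
phases = [random.uniform(-cmath.pi, cmath.pi) for _ in range(D)]

def encode(x):
    return [cmath.exp(1j * p * x) for p in phases]

def sim(u, v):
    return sum(a * b.conjugate() for a, b in zip(u, v)).real / len(u)

# Self-similarity is 1, and similarity decays with |x - y|:
assert abs(sim(encode(0.3), encode(0.3)) - 1.0) < 1e-9
assert sim(encode(0.0), encode(0.2)) > sim(encode(0.0), encode(1.0))
```

With phases drawn uniformly on [-pi, pi], the expected similarity is a sinc function of the separation; other phase distributions induce other kernels, which is what lets the framework compute with functions.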

Cellular Automata Can Reduce Memory Requirements of Collective-State Computing

no code implementations • 7 Oct 2020 • Denis Kleyko, E. Paxon Frady, Friedrich T. Sommer

The variable encoding uses a fixed set of random patterns, which has to be stored and kept available during the computation.
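The memory saving works because a cellular automaton can regenerate the pseudorandom patterns deterministically from a short seed instead of storing the whole codebook. A minimal sketch with elementary rule 90 on a ring (the paper's specific automaton and sizes may differ):

```python
# Instead of storing a table of random codevectors, store one short seed and
# regenerate the patterns on the fly with a cellular automaton.  Elementary
# CA rule 90 on a ring: each cell becomes the XOR of its two neighbors.

def rule90_step(cells):
    n = len(cells)
    return [cells[(i - 1) % n] ^ cells[(i + 1) % n] for i in range(n)]

def patterns(seed, steps):
    """Regenerate `steps` binary patterns deterministically from one seed."""
    out, cells = [], list(seed)
    for _ in range(steps):
        out.append(cells)
        cells = rule90_step(cells)
    return out

seed = [0] * 16
seed[7] = 1                               # one stored seed replaces a codebook
ps = patterns(seed, 5)
assert ps == patterns(seed, 5)            # fully reproducible from the seed
assert len({tuple(p) for p in ps}) == 5   # successive patterns differ
```

Storage drops from steps × n bits for an explicit codebook to the n bits of the seed, at the cost of recomputing the CA trajectory during the computation.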

A Neural Network MCMC sampler that maximizes Proposal Entropy

1 code implementation • 7 Oct 2020 • Zengyi Li, Yubei Chen, Friedrich T. Sommer

However, in the continuous case, unfavorable geometry of the target distribution can greatly limit the efficiency of MCMC methods.
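The baseline this work improves on is a Metropolis-Hastings sampler with a fixed proposal, whose efficiency degrades when the proposal is mismatched to the target geometry. A minimal random-walk sketch on a 1-D standard normal (toy target; the paper learns the proposal with a neural network instead):

```python
import math, random

# Random-walk Metropolis-Hastings with a fixed Gaussian proposal.

def log_target(x):
    return -0.5 * x * x                  # log-density of N(0,1), up to a constant

def metropolis(n, step=1.0, seed=0):
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)  # fixed, symmetric proposal
        if math.log(rng.random()) < log_target(prop) - log_target(x):
            x = prop                     # Metropolis accept
        samples.append(x)
    return samples

xs = metropolis(20000)
mean = sum(xs) / len(xs)
var = sum((v - mean) ** 2 for v in xs) / len(xs)
assert abs(mean) < 0.15 and abs(var - 1.0) < 0.3   # roughly matches N(0,1)
```

For an anisotropic or high-dimensional target, the single `step` scale forces tiny moves along the most constrained direction, which is the inefficiency that maximizing proposal entropy is meant to remove.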

Variable Binding for Sparse Distributed Representations: Theory and Applications

no code implementations • 14 Sep 2020 • E. Paxon Frady, Denis Kleyko, Friedrich T. Sommer

One binding method for general sparse vectors extends earlier proposals to reduce the tensor product into a vector, such as circular convolution.
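Circular convolution binding, the earlier proposal the abstract refers to (Holographic Reduced Representations), can be sketched for dense vectors. Dimensions and the O(D^2) convolution are illustrative choices for clarity:

```python
import random

# Holographic Reduced Representations bind two random vectors by circular
# convolution; binding with the "approximate inverse" (index-reversed copy)
# unbinds, recovering a noisy version of the partner that a cleanup memory
# can then identify.

random.seed(0)
D = 256

def randvec():
    n = D ** 0.5
    return [random.gauss(0.0, 1.0) / n for _ in range(D)]   # ~unit norm

def cconv(a, b):
    return [sum(a[k] * b[(i - k) % D] for k in range(D)) for i in range(D)]

def inv(a):
    return [a[(-i) % D] for i in range(D)]    # approximate inverse a'[i] = a[-i]

def cos(a, b):
    num = sum(x * y for x, y in zip(a, b))
    den = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
    return num / den

role, filler, other = randvec(), randvec(), randvec()
bound = cconv(role, filler)
decoded = cconv(inv(role), bound)             # unbind: noisy copy of filler
assert cos(decoded, filler) > 0.5             # much closer to filler...
assert abs(cos(decoded, other)) < 0.3         # ...than to an unrelated vector
```

The paper's contribution is extending this kind of binding to sparse vectors; the dense sketch above only shows the tensor-product-compressing behavior being generalized.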

Resonator networks for factoring distributed representations of data structures

no code implementations • 7 Jul 2020 • E. Paxon Frady, Spencer Kent, Bruno A. Olshausen, Friedrich T. Sommer

The ability to encode and manipulate data structures with distributed neural representations could qualitatively enhance the capabilities of traditional neural networks by supporting rule-based symbolic reasoning, a central property of cognition.
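A two-factor resonator network can be sketched in a few lines: a composite vector c = a ⊙ b (element-wise product of bipolar codevectors) is factored by alternately unbinding one estimate and cleaning the result up through the other factor's codebook. Sizes are illustrative, and the estimate is initialized near the solution to keep the toy demo compact; the paper shows convergence from generic initial conditions:

```python
import random

# Resonator network for factoring c = a (*) b, with (*) = element-wise product.

random.seed(0)
D, K = 1024, 8

def rand_bipolar():
    return [random.choice((-1, 1)) for _ in range(D)]

A = [rand_bipolar() for _ in range(K)]           # codebook for factor 1
B = [rand_bipolar() for _ in range(K)]           # codebook for factor 2
ia, ib = 3, 5
c = [x * y for x, y in zip(A[ia], B[ib])]        # composite to be factored

def project(v, book):
    """Codebook cleanup: project v onto the codebook span, then re-binarize."""
    scores = [sum(x * y for x, y in zip(v, code)) for code in book]
    out = [sum(s * code[i] for s, code in zip(scores, book)) for i in range(D)]
    return [1 if x >= 0 else -1 for x in out]

xb = list(B[ib])
for i in random.sample(range(D), 205):
    xb[i] = -xb[i]                               # corrupt ~20% of the b-estimate
for _ in range(5):
    xa = project([u * w for u, w in zip(c, xb)], A)   # unbind b-estimate, clean up
    xb = project([u * w for u, w in zip(c, xa)], B)   # unbind a-estimate, clean up

assert xa == A[ia] and xb == B[ib]               # exact factors recovered
```

Each iteration uses superposition to hold all remaining factor hypotheses at once, which is how the network searches the combinatorial space of codeword pairs without enumerating it.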

A Model for Image Segmentation in Retina

no code implementations • 6 May 2020 • Christopher Warner, Friedrich T. Sommer

Networks with phase interactions set by standard representations of the feature graph (adjacency matrix, Graph Laplacian or modularity) failed to exhibit segmentation performance significantly over the baseline, a model of independent sensors.

Clustering • Graph Clustering • +3

Complex Amplitude-Phase Boltzmann Machines

no code implementations • 4 May 2020 • Zengyi Li, Friedrich T. Sommer

We extend the framework of Boltzmann machines to a network of complex-valued neurons with variable amplitudes, referred to as Complex Amplitude-Phase Boltzmann machine (CAP-BM).

Learning Energy-Based Models in High-Dimensional Spaces with Multi-scale Denoising Score Matching

2 code implementations • 17 Oct 2019 • Zengyi Li, Yubei Chen, Friedrich T. Sommer

Recently, Song and Ermon (2019) showed that a generative model trained by denoising score matching achieves excellent sample synthesis when trained with data samples corrupted with multiple levels of noise.

Denoising • Image Inpainting • +1
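The denoising score matching (DSM) objective itself can be sanity-checked on 1-D Gaussian data, where everything is analytic: corrupt x with noise of scale sigma and fit s(.) to minimize E||s(x + eps) + eps / sigma^2||^2, whose minimizer is the score of the noise-smoothed data density. The closed forms below hold for N(0,1) data; the toy check is an illustration, not the paper's multi-scale model:

```python
import random

# DSM sanity check: for data x ~ N(0,1) smoothed with noise of std sigma,
# the smoothed density is N(0, 1 + sigma^2), so the optimal score is
# s*(y) = -y / (1 + sigma^2).  The DSM loss should prefer s* over the
# (mismatched) score of the clean density, s(y) = -y.

random.seed(0)
sigma, n = 1.0, 100_000
data = [random.gauss(0.0, 1.0) for _ in range(n)]
noise = [random.gauss(0.0, sigma) for _ in range(n)]

def dsm_loss(score):
    total = 0.0
    for x, e in zip(data, noise):
        r = score(x + e) + e / sigma**2   # DSM residual at the noisy point
        total += r * r
    return total / n

loss_true = dsm_loss(lambda y: -y / (1.0 + sigma**2))  # score of smoothed density
loss_bad = dsm_loss(lambda y: -y)                      # score of clean density
assert loss_true < loss_bad
```

Training a network at several noise scales at once, as the paper does, amounts to minimizing this objective jointly over a range of sigma values.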

Resonator Networks outperform optimization methods at solving high-dimensional vector factorization

no code implementations • 19 Jun 2019 • Spencer J. Kent, E. Paxon Frady, Friedrich T. Sommer, Bruno A. Olshausen

We develop theoretical foundations of Resonator Networks, a new type of recurrent neural network introduced in Frady et al. (2020) to solve a high-dimensional vector factorization problem arising in Vector Symbolic Architectures.

Vocal Bursts Intensity Prediction

Robust computation with rhythmic spike patterns

no code implementations • 23 Jan 2019 • E. Paxon Frady, Friedrich T. Sommer

Here, we propose a novel type of attractor neural network in complex state space, and show how it can be leveraged to construct spiking neural networks with robust computational properties through a phase-to-timing mapping.

A theory of sequence indexing and working memory in recurrent neural networks

no code implementations • 28 Feb 2018 • E. Paxon Frady, Denis Kleyko, Friedrich T. Sommer

The theory describes linear readout of analog data, and readout with winner-take-all error correction of symbolic data as proposed in VSA models.

Retrieval

Theory of the superposition principle for randomized connectionist representations in neural networks

no code implementations • 5 Jul 2017 • E. Paxon Frady, Denis Kleyko, Friedrich T. Sommer

The superposition operations in different existing models are mapped to linear neural networks with unitary recurrent matrices, in which retrieval accuracy can be analyzed by a single equation.

Retrieval

Learning overcomplete, low coherence dictionaries with linear inference

no code implementations • 10 Jun 2016 • Jesse A. Livezey, Alejandro F. Bujan, Friedrich T. Sommer

Further, by comparing ICA algorithms on synthetic data and natural images to the computationally more expensive sparse coding solution, we show that the coherence control biases the exploration of the data manifold, sometimes yielding suboptimal solutions.
