Search Results for author: Bruno A. Olshausen

Found 23 papers, 3 papers with code

Compositional Factorization of Visual Scenes with Convolutional Sparse Coding and Resonator Networks

no code implementations 29 Apr 2024 Christopher J. Kymn, Sonia Mazelet, Annabel Ng, Denis Kleyko, Bruno A. Olshausen

We propose a system for visual scene analysis and recognition based on encoding the sparse, latent feature-representation of an image into a high-dimensional vector that is subsequently factorized to parse scene content.

Scene Parsing

Computing with Residue Numbers in High-Dimensional Representation

no code implementations 8 Nov 2023 Christopher J. Kymn, Denis Kleyko, E. Paxon Frady, Connor Bybee, Pentti Kanerva, Friedrich T. Sommer, Bruno A. Olshausen

We introduce Residue Hyperdimensional Computing, a computing framework that unifies residue number systems with an algebra defined over random, high-dimensional vectors.

Combinatorial Optimization
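
The core construction lends itself to a short sketch. The code below is my own illustration (not the authors' implementation): each modulus gets a random phasor hypervector whose phases are multiples of 2*pi/m, so elementwise powers encode residues and elementwise multiplication of encodings adds the underlying integers modulo the product of the moduli. The dimension and moduli are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(0)
    D = 1000                   # hypervector dimension (illustrative)
    moduli = [3, 5, 7]         # pairwise coprime, so the representable range is 105

    # One random phasor base vector per modulus; its phases are multiples of
    # 2*pi/m, so elementwise powers repeat with period m (residue arithmetic).
    bases = [np.exp(2j * np.pi * rng.integers(0, m, D) / m) for m in moduli]

    def encode(x):
        """Bind the per-modulus encodings base**(x mod m) into one hypervector."""
        hv = np.ones(D, dtype=complex)
        for base, m in zip(bases, moduli):
            hv = hv * base ** (x % m)
        return hv

    def decode(hv, limit=105):
        """Nearest codeword by real inner-product similarity."""
        return max(range(limit), key=lambda k: np.real(np.vdot(encode(k), hv)))

    # Binding two encodings adds the underlying integers modulo 105.
    print(decode(encode(17) * encode(25)))   # -> 42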

Efficient Decoding of Compositional Structure in Holistic Representations

no code implementations 26 May 2023 Denis Kleyko, Connor Bybee, Ping-Chen Huang, Christopher J. Kymn, Bruno A. Olshausen, E. Paxon Frady, Friedrich T. Sommer

In particular, we find that the decoding techniques from the sparse coding and compressed sensing literature (rarely used for Hyperdimensional Computing/Vector Symbolic Architectures) are also well-suited for decoding information from the compositional distributed representations.

Retrieval
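
As a toy illustration of that claim (my own sketch, not the paper's experiments): bundle several random bipolar codewords by superposition and compare a simple matched-filter readout with a least-squares readout of the kind used in sparse-coding/compressed-sensing decoders. All sizes are arbitrary.

    import numpy as np

    rng = np.random.default_rng(1)
    N, M, K = 256, 200, 80                 # dimension, codebook size, items bundled

    codebook = rng.choice([-1.0, 1.0], size=(N, M))
    chosen = rng.choice(M, size=K, replace=False)
    s = codebook[:, chosen].sum(axis=1)    # compositional vector: superposition of K codewords

    # (a) Matched filter: correlate with every codeword, keep the K largest scores.
    mf_hits = set(np.argsort(codebook.T @ s)[-K:])

    # (b) Least-squares readout, in the spirit of compressed-sensing decoders.
    coeffs, *_ = np.linalg.lstsq(codebook, s, rcond=None)
    ls_hits = set(np.argsort(coeffs)[-K:])

    print(len(mf_hits & set(chosen)), "/", K, "recovered by matched filter")
    print(len(ls_hits & set(chosen)), "/", K, "recovered by least squares")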

Learning and generalization of compositional representations of visual scenes

no code implementations 23 Mar 2023 E. Paxon Frady, Spencer Kent, Quinn Tran, Pentti Kanerva, Bruno A. Olshausen, Friedrich T. Sommer

In contrast to learning category labels, here we train deep neural networks to output the full compositional vector description of an input image.

Object

Efficient Optimization with Higher-Order Ising Machines

no code implementations 7 Dec 2022 Connor Bybee, Denis Kleyko, Dmitri E. Nikonov, Amir Khosrowshahi, Bruno A. Olshausen, Friedrich T. Sommer

A prominent approach to solving combinatorial optimization problems on parallel hardware is Ising machines, i.e., hardware implementations of networks of interacting binary spin variables.

Combinatorial Optimization
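
For readers unfamiliar with the setup, here is a small software stand-in (not the analog hardware the paper studies): a spin system with pairwise plus a few third-order couplings, minimized by simulated annealing. The couplings and annealing schedule are arbitrary.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 12

    # Toy energy with pairwise couplings J plus a few third-order terms
    # (the "higher-order" interactions in the paper's sense).
    J = np.triu(rng.normal(size=(n, n)), 1)
    triples = [tuple(rng.choice(n, 3, replace=False)) for _ in range(10)]
    T = rng.normal(size=len(triples))

    def energy(s):
        e = -s @ J @ s
        for w, (i, j, k) in zip(T, triples):
            e -= w * s[i] * s[j] * s[k]
        return e

    # Simulated annealing over spins in {-1, +1}.
    s = rng.choice([-1, 1], size=n).astype(float)
    best_e = energy(s)
    for t in range(20000):
        temp = max(2.0 * (1 - t / 20000), 1e-3)
        i = rng.integers(n)
        s_new = s.copy(); s_new[i] *= -1
        dE = energy(s_new) - energy(s)
        if dE < 0 or rng.random() < np.exp(-dE / temp):
            s = s_new
            best_e = min(best_e, energy(s))
    print("lowest energy found:", round(best_e, 3))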

Computing with Hypervectors for Efficient Speaker Identification

no code implementations 28 Aug 2022 Ping-Chen Huang, Denis Kleyko, Jan M. Rabaey, Bruno A. Olshausen, Pentti Kanerva

With only 1.02k active parameters and a 128-minute pass through the training data we achieve Top-1 and Top-5 scores of 31% and 52% on the VoxCeleb1 dataset of 1,251 speakers.

Quantization · Speaker Identification
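
The general hyperdimensional-computing recipe behind results like this can be sketched as follows; this is a generic illustration with made-up features, speakers, and sizes, not the paper's actual encoding or the VoxCeleb1 pipeline.

    import numpy as np

    rng = np.random.default_rng(3)
    D, n_features, n_levels = 2000, 20, 16

    feat_hvs  = rng.choice([-1, 1], size=(n_features, D))  # one hypervector per feature id
    level_hvs = rng.choice([-1, 1], size=(n_levels, D))    # one hypervector per quantized level

    def encode_frames(frames):
        """frames: (T, n_features) array of values in [0, 1)."""
        levels = np.clip((frames * n_levels).astype(int), 0, n_levels - 1)
        acc = np.zeros(D)
        for frame in levels:
            # bind each feature id with its quantized level, bundle over features and frames
            acc += (feat_hvs * level_hvs[frame]).sum(axis=0)
        return np.sign(acc)

    # A speaker profile is the bundled hypervector of that speaker's training frames;
    # identification picks the profile with the highest similarity to the test encoding.
    train = [rng.random((200, n_features)) for _ in range(5)]     # 5 fake "speakers"
    profiles = np.stack([encode_frames(f) for f in train])
    test_hv = encode_frames(train[2][:50])
    print("identified speaker:", int(np.argmax(profiles @ test_hv)))   # -> 2 (typically)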

Learning and Inference in Sparse Coding Models with Langevin Dynamics

no code implementations 23 Apr 2022 Michael Y.-S. Fang, Mayur Mudigonda, Ryan Zarcone, Amir Khosrowshahi, Bruno A. Olshausen

Moreover we show that Langevin dynamics lead to an efficient procedure for sampling from the posterior distribution in the 'L0 sparse' regime, where latent variables are encouraged to be set to zero as opposed to having a small L1 norm.
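
A minimal sketch of the basic procedure, assuming an L1 (Laplacian) prior rather than the L0 regime that is the paper's main focus; the dictionary, data, and step size below are synthetic and illustrative.

    import numpy as np

    rng = np.random.default_rng(4)
    n, m = 64, 128                              # signal dimension, dictionary size
    Phi = rng.normal(size=(n, m)) / np.sqrt(n)  # random dictionary
    a_true = np.zeros(m)
    a_true[rng.choice(m, 5, replace=False)] = 3.0
    x = Phi @ a_true + 0.1 * rng.normal(size=n)

    lam, sigma2, eps = 1.0, 0.1 ** 2, 1e-3      # sparsity weight, noise variance, step size

    def grad_energy(a):
        # E(a) = ||x - Phi a||^2 / (2 sigma^2) + lam * ||a||_1   (L1 prior)
        return -Phi.T @ (x - Phi @ a) / sigma2 + lam * np.sign(a)

    # Unadjusted Langevin dynamics: a gradient step plus injected Gaussian noise
    # draws approximate samples from the posterior over coefficients.
    a, samples = np.zeros(m), []
    for t in range(5000):
        a = a - 0.5 * eps * grad_energy(a) + np.sqrt(eps) * rng.normal(size=m)
        if t >= 2000:
            samples.append(a.copy())

    post_mean = np.mean(samples, axis=0)
    print("relative reconstruction error:",
          round(float(np.linalg.norm(x - Phi @ post_mean) / np.linalg.norm(x)), 3))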

Integer Factorization with Compositional Distributed Representations

no code implementations 2 Mar 2022 Denis Kleyko, Connor Bybee, Christopher J. Kymn, Bruno A. Olshausen, Amir Khosrowshahi, Dmitri E. Nikonov, Friedrich T. Sommer, E. Paxon Frady

In this paper, we present an approach to integer factorization using distributed representations formed with Vector Symbolic Architectures.

Computing on Functions Using Randomized Vector Representations

no code implementations 8 Sep 2021 E. Paxon Frady, Denis Kleyko, Christopher J. Kymn, Bruno A. Olshausen, Friedrich T. Sommer

By analogy to VSA, we call this new function encoding and computing framework Vector Function Architecture (VFA).

Density Estimation
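
The key object in this framework, fractional power encoding, is easy to illustrate (my own minimal sketch; the dimension and test points are arbitrary): a scalar is encoded by raising a fixed random phasor vector to that power elementwise, and inner products of encodings approximate a shift-invariant kernel.

    import numpy as np

    rng = np.random.default_rng(5)
    D = 4000
    z = np.exp(1j * rng.uniform(-np.pi, np.pi, D))   # random phasor base vector

    def fpe(x):
        """Fractional power encoding of a scalar x."""
        return z ** x                                 # elementwise: exp(i * x * phase)

    def sim(u, v):
        return np.real(np.vdot(u, v)) / D

    # Similarity between encodings approximates a sinc kernel of the difference
    # (for uniformly distributed base phases); sums of such encodings of sample
    # points then represent functions, as in kernel methods.
    for d in [0.0, 0.5, 1.0, 2.0, 3.5]:
        print(d, round(sim(fpe(0.0), fpe(d)), 3), "vs", round(float(np.sinc(d)), 3))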

Resonator networks for factoring distributed representations of data structures

no code implementations 7 Jul 2020 E. Paxon Frady, Spencer Kent, Bruno A. Olshausen, Friedrich T. Sommer

The ability to encode and manipulate data structures with distributed neural representations could qualitatively enhance the capabilities of traditional neural networks by supporting rule-based symbolic reasoning, a central property of cognition.
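
The resonator dynamics themselves are short enough to sketch in numpy, assuming bipolar codevectors and Hadamard-product binding as in the paper (codebook sizes and dimension below are arbitrary): each factor estimate is updated by unbinding the current estimates of the other factors and cleaning up against its codebook.

    import numpy as np

    rng = np.random.default_rng(6)
    D, K = 1500, 30                              # vector dimension, codewords per factor
    bipolar = lambda v: np.where(v >= 0, 1.0, -1.0)

    # Three codebooks with codewords as columns (D x K).
    A, B, C = (rng.choice([-1.0, 1.0], size=(D, K)) for _ in range(3))
    ia, ib, ic = rng.integers(K, size=3)
    s = A[:, ia] * B[:, ib] * C[:, ic]           # composite: product of one codeword per factor

    # Initialize each estimate to the (sign of the) superposition of its codebook,
    # then iterate: unbind the other factors, clean up against the codebook.
    xa, xb, xc = (bipolar(M.sum(axis=1)) for M in (A, B, C))
    for _ in range(100):
        xa = bipolar(A @ (A.T @ (s * xb * xc)))
        xb = bipolar(B @ (B.T @ (s * xa * xc)))
        xc = bipolar(C @ (C.T @ (s * xa * xb)))

    print(int(np.argmax(A.T @ xa)) == ia,
          int(np.argmax(B.T @ xb)) == ib,
          int(np.argmax(C.T @ xc)) == ic)        # typically: True True True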

Word Embedding Visualization Via Dictionary Learning

1 code implementation 9 Oct 2019 Juexiao Zhang, Yubei Chen, Brian Cheung, Bruno A. Olshausen

Co-occurrence statistics based word embedding techniques have proved to be very useful in extracting the semantic and syntactic representation of words as low dimensional continuous vectors.

Dictionary Learning
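
A rough sketch of the pipeline the paper builds on, using scikit-learn's dictionary learning on random vectors as a stand-in for pretrained word embeddings (the sizes and parameters here are illustrative, not the paper's setup):

    import numpy as np
    from sklearn.decomposition import DictionaryLearning

    rng = np.random.default_rng(7)
    embeddings = rng.normal(size=(500, 50))      # stand-in for (n_words, embedding_dim)

    # Factorize each word vector into a sparse combination of learned atoms;
    # in the paper, the atoms serve as interpretable semantic/syntactic factors.
    dl = DictionaryLearning(n_components=80, transform_algorithm="lasso_lars",
                            transform_alpha=0.5, max_iter=20, random_state=0)
    codes = dl.fit_transform(embeddings)         # (n_words, n_atoms) sparse coefficients
    atoms = dl.components_                       # (n_atoms, embedding_dim) dictionary

    print("atoms loaded by word 0:", np.nonzero(codes[0])[0])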

Resonator Networks outperform optimization methods at solving high-dimensional vector factorization

no code implementations 19 Jun 2019 Spencer J. Kent, E. Paxon Frady, Friedrich T. Sommer, Bruno A. Olshausen

We develop theoretical foundations of Resonator Networks, a new type of recurrent neural network introduced in Frady et al. (2020) to solve a high-dimensional vector factorization problem arising in Vector Symbolic Architectures.

Vocal Bursts Intensity Prediction

The Sparse Manifold Transform

no code implementations NeurIPS 2018 Yubei Chen, Dylan M. Paiton, Bruno A. Olshausen

We present a signal representation framework called the sparse manifold transform that combines key ideas from sparse coding, manifold learning, and slow feature analysis.

Self-Supervised Learning

DeepMovie: Using Optical Flow and Deep Neural Networks to Stylize Movies

no code implementations 26 May 2016 Alexander G. Anderson, Cory P. Berg, Daniel P. Mossing, Bruno A. Olshausen

The other naive method that initializes the optimization for the next frame using the rendered version of the previous frame also produces poor results because the features of the texture stay fixed relative to the frame of the movie instead of moving with objects in the scene.

Optical Flow Estimation · Style Transfer

Discovering Hidden Factors of Variation in Deep Networks

1 code implementation 20 Dec 2014 Brian Cheung, Jesse A. Livezey, Arjun K. Bansal, Bruno A. Olshausen

Deep learning has enjoyed a great deal of success because of its ability to learn useful features for tasks such as classification.

General Classification

Group Sparse Coding with a Laplacian Scale Mixture Prior

no code implementations NeurIPS 2010 Pierre Garrigues, Bruno A. Olshausen

We show that, due to the conjugacy of the Gamma prior, it is possible to derive efficient inference procedures for both the coefficients and the scale parameter.

Compressive Sensing
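
The conjugacy being exploited can be written down directly (a standard derivation in my own notation, for one group of coefficients sharing a single inverse-scale lambda):

    p(x_i \mid \lambda) = \tfrac{\lambda}{2}\, e^{-\lambda |x_i|},
    \qquad
    p(\lambda) = \mathrm{Gamma}(\lambda \mid \alpha, \beta) \propto \lambda^{\alpha-1} e^{-\beta\lambda}

    p(\lambda \mid x_1, \dots, x_n)
    \;\propto\; \lambda^{\alpha+n-1}\, e^{-\left(\beta + \sum_i |x_i|\right)\lambda}
    \;=\; \mathrm{Gamma}\!\left(\lambda \,\middle|\, \alpha + n,\; \beta + \textstyle\sum_i |x_i|\right)

The scale parameter's posterior stays in the Gamma family, which is what makes the alternating coefficient/scale updates described in the abstract tractable.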

An Unsupervised Algorithm For Learning Lie Group Transformations

no code implementations 7 Jan 2010 Jascha Sohl-Dickstein, Ching Ming Wang, Bruno A. Olshausen

Transformation operators are represented in their eigen-basis, reducing the computational complexity of parameter estimation to that of training a linear transformation model.

Translation

Learning transport operators for image manifolds

no code implementations NeurIPS 2009 Benjamin Culpepper, Bruno A. Olshausen

We describe a method for learning a group of continuous transformation operators to traverse smooth nonlinear manifolds.
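
The model class is compact enough to show directly: a point is transported along the manifold by the matrix exponential of a weighted sum of learned generators. In the sketch below, the single generator is fixed by hand to the 2-D rotation generator rather than learned, purely to illustrate the form x1 ≈ expm(sum_k c_k Psi_k) x0.

    import numpy as np
    from scipy.linalg import expm

    # One hand-picked transport operator (the generator of 2-D rotations),
    # standing in for operators that the paper learns from data.
    Psi = np.array([[0.0, -1.0],
                    [1.0,  0.0]])

    x0 = np.array([1.0, 0.0])
    for c in [0.0, 0.5, 1.0, 1.5]:
        x1 = expm(c * Psi) @ x0      # transport x0 along the (circle) manifold by amount c
        print(round(c, 2), np.round(x1, 3))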

Learning Horizontal Connections in a Sparse Coding Model of Natural Images

no code implementations NeurIPS 2007 Pierre Garrigues, Bruno A. Olshausen

It has been shown that adapting a dictionary of basis functions to the statistics of natural images so as to maximize sparsity in the coefficients results in a set of dictionary elements whose spatial properties resemble those of V1 (primary visual cortex) receptive fields.
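
For context, the underlying sparse coding model (without the horizontal connections this paper adds) can be sketched as alternating sparse inference with a gradient dictionary update; here random Gaussian vectors stand in for whitened natural image patches, and all sizes and learning rates are illustrative.

    import numpy as np

    rng = np.random.default_rng(9)
    patch_dim, n_atoms, lam = 64, 128, 0.5        # 8x8 patches, overcomplete dictionary

    Phi = rng.normal(size=(patch_dim, n_atoms))
    Phi /= np.linalg.norm(Phi, axis=0)

    def infer(X, Phi, n_steps=100):
        """ISTA: coefficients minimizing ||X - Phi A||^2 / 2 + lam * ||A||_1."""
        L = np.linalg.norm(Phi, 2) ** 2           # Lipschitz constant of the data term
        A = np.zeros((Phi.shape[1], X.shape[1]))
        for _ in range(n_steps):
            A = A + Phi.T @ (X - Phi @ A) / L
            A = np.sign(A) * np.maximum(np.abs(A) - lam / L, 0.0)
        return A

    for it in range(50):                          # alternate inference and dictionary learning
        X = rng.normal(size=(patch_dim, 100))     # stand-in for a batch of whitened patches
        A = infer(X, Phi)
        Phi += 0.01 * (X - Phi @ A) @ A.T / X.shape[1]
        Phi /= np.linalg.norm(Phi, axis=0) + 1e-12

    print("mean active coefficients per patch:",
          float((np.abs(A) > 1e-6).sum(axis=0).mean()))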
