Search Results for author: Amir Khosrowshahi

Found 6 papers, 1 paper with code

Efficient Optimization with Higher-Order Ising Machines

no code implementations • 7 Dec 2022 • Connor Bybee, Denis Kleyko, Dmitri E. Nikonov, Amir Khosrowshahi, Bruno A. Olshausen, Friedrich T. Sommer

A prominent approach to solving combinatorial optimization problems on parallel hardware is Ising machines, i.e., hardware implementations of networks of interacting binary spin variables.

Combinatorial Optimization
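The snippet above describes the generic Ising-machine idea. As a minimal software sketch (not the authors' hardware or their higher-order formulation), one can minimize an Ising energy with pairwise couplings plus an illustrative third-order term via simulated annealing; the couplings `J`, `K` and the cooling schedule below are made-up toy values:

```python
import math
import random

def energy(s, J, K):
    """Ising energy with pairwise (J) and illustrative third-order (K) couplings."""
    e = 0.0
    for (i, j), w in J.items():
        e -= w * s[i] * s[j]
    for (i, j, k), w in K.items():
        e -= w * s[i] * s[j] * s[k]
    return e

def anneal(n, J, K, steps=20000, t0=2.0, t1=0.01, seed=0):
    """Single-spin-flip simulated annealing with a geometric cooling schedule."""
    rng = random.Random(seed)
    s = [rng.choice((-1, 1)) for _ in range(n)]
    e = energy(s, J, K)
    for t in range(steps):
        temp = t0 * (t1 / t0) ** (t / steps)  # cool from t0 down to t1
        i = rng.randrange(n)
        s[i] = -s[i]  # propose a flip
        e_new = energy(s, J, K)
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / temp):
            e = e_new  # accept
        else:
            s[i] = -s[i]  # reject, flip back
    return s, e

# Toy instance: a 4-spin ferromagnetic chain plus one third-order coupling.
J = {(0, 1): 1.0, (1, 2): 1.0, (2, 3): 1.0}
K = {(0, 1, 2): 0.5}
best_e = min(anneal(4, J, K, seed=s)[1] for s in range(5))
```

The ground state here is all spins +1 with energy -3.5; a few random restarts make the annealer reliably find it on this tiny instance.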

Learning and Inference in Sparse Coding Models with Langevin Dynamics

no code implementations • 23 Apr 2022 • Michael Y. -S. Fang, Mayur Mudigonda, Ryan Zarcone, Amir Khosrowshahi, Bruno A. Olshausen

Moreover, we show that Langevin dynamics lead to an efficient procedure for sampling from the posterior distribution in the 'L0 sparse' regime, where latent variables are encouraged to be set to zero as opposed to having a small L1 norm.
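The core sampling primitive the snippet refers to is Langevin dynamics: gradient ascent on the log-density plus injected Gaussian noise. A minimal sketch on a 1-D standard normal target (not the paper's sparse coding posterior; step size and run length are illustrative) looks like this:

```python
import math
import random

def langevin_samples(grad_log_p, x0=0.0, eps=0.01, n_steps=50000, burn_in=5000, seed=0):
    """Unadjusted Langevin dynamics:
    x <- x + (eps/2) * grad log p(x) + sqrt(eps) * N(0, 1)."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for t in range(n_steps):
        x = x + 0.5 * eps * grad_log_p(x) + math.sqrt(eps) * rng.gauss(0.0, 1.0)
        if t >= burn_in:  # discard early, non-stationary iterates
            samples.append(x)
    return samples

# Target: standard normal, so grad log p(x) = -x.
xs = langevin_samples(lambda x: -x)
mean = sum(xs) / len(xs)
var = sum((v - mean) ** 2 for v in xs) / len(xs)
```

After burn-in the chain's samples approximate the target: the empirical mean is near 0 and the variance near 1, up to discretization bias of order `eps`.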

Integer Factorization with Compositional Distributed Representations

no code implementations • 2 Mar 2022 • Denis Kleyko, Connor Bybee, Christopher J. Kymn, Bruno A. Olshausen, Amir Khosrowshahi, Dmitri E. Nikonov, Friedrich T. Sommer, E. Paxon Frady

In this paper, we present an approach to integer factorization using distributed representations formed with Vector Symbolic Architectures.
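The building block behind such approaches is the Vector Symbolic Architecture bind/unbind primitive. The sketch below shows only that primitive for bipolar hypervectors (it is not the paper's factorization algorithm, which searches over codebooks with a resonator-style network; the codebook range and dimensionality are illustrative):

```python
import random

def random_hv(dim, rng):
    """Random bipolar hypervector with entries in {-1, +1}."""
    return [rng.choice((-1, 1)) for _ in range(dim)]

def bind(a, b):
    """Elementwise (Hadamard) binding; self-inverse for bipolar vectors."""
    return [x * y for x, y in zip(a, b)]

def cosine(a, b):
    # Bipolar vectors all have norm sqrt(dim), so cosine = dot / dim.
    return sum(x * y for x, y in zip(a, b)) / len(a)

rng = random.Random(0)
dim = 2048
codebook = {n: random_hv(dim, rng) for n in range(2, 20)}

# Encode the product 3 * 5 as the binding of the two factor hypervectors.
composite = bind(codebook[3], codebook[5])

# Unbinding with one factor recovers the other; "cleanup" picks the
# nearest codebook entry by cosine similarity.
recovered = bind(composite, codebook[3])
best = max(codebook, key=lambda n: cosine(recovered, codebook[n]))
```

Because elementwise binding is its own inverse on bipolar vectors, `recovered` equals `codebook[5]` exactly here; with superpositions of many candidates, recovery becomes noisy and the cleanup step does real work.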

Design of optical neural networks with component imprecisions

1 code implementation • 13 Dec 2019 • Michael Y. -S. Fang, Sasikanth Manipatruni, Casimir Wierzynski, Amir Khosrowshahi, Michael R. DeWeese

To aid the design of scalable, fault-resistant optical neural networks (ONNs), we investigate how architectural choices affect the robustness of ONNs to imprecise components.
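The kind of imprecision studied in this line of work can be illustrated on a single Mach-Zehnder interferometer (MZI), the 2x2 unitary building block of many ONN meshes. This sketch perturbs the MZI's phase settings with Gaussian noise and measures output fidelity; it is not the paper's simulation setup, and the phase values and noise level `sigma` are made up:

```python
import cmath
import math
import random

def matmul2(A, B):
    """2x2 complex matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mzi(theta, phi):
    """MZI transfer matrix: beam splitter, internal phase theta,
    beam splitter, external phase phi."""
    bs = [[1 / math.sqrt(2), 1j / math.sqrt(2)],
          [1j / math.sqrt(2), 1 / math.sqrt(2)]]
    inner = [[cmath.exp(1j * theta), 0], [0, 1]]
    outer = [[cmath.exp(1j * phi), 0], [0, 1]]
    return matmul2(outer, matmul2(bs, matmul2(inner, bs)))

def apply(U, v):
    return [U[0][0] * v[0] + U[0][1] * v[1],
            U[1][0] * v[0] + U[1][1] * v[1]]

def fidelity(u, v):
    """|<u|v>| for unit vectors: 1.0 means identical outputs."""
    return abs(u[0].conjugate() * v[0] + u[1].conjugate() * v[1])

rng = random.Random(0)
theta, phi = 0.7, 0.3
x = [1 + 0j, 0j]  # light in the top input port
ideal = apply(mzi(theta, phi), x)

sigma = 0.05  # phase-error standard deviation in radians (illustrative)
noisy = apply(mzi(theta + rng.gauss(0, sigma), phi + rng.gauss(0, sigma)), x)
f = fidelity(ideal, noisy)
```

Sweeping `sigma` (and composing many such MZIs into a mesh) is how one would probe, in simulation, how quickly fidelity degrades under component imprecision.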

Flexpoint: An Adaptive Numerical Format for Efficient Training of Deep Neural Networks

no code implementations • NeurIPS 2017 • Urs Köster, Tristan J. Webb, Xin Wang, Marcel Nassar, Arjun K. Bansal, William H. Constable, Oğuz H. Elibol, Scott Gray, Stewart Hall, Luke Hornof, Amir Khosrowshahi, Carey Kloss, Ruby J. Pai, Naveen Rao

Here we present the Flexpoint data format, which aims to fully replace the 32-bit floating-point format in training and inference and is designed to support modern deep network topologies without modification.

Generative Adversarial Network
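Flexpoint is a block floating-point scheme: a tensor stores integer mantissas per element plus a single shared exponent. The sketch below shows only the static quantize/dequantize round trip for a 16-bit-mantissa variant; the real format additionally manages the shared exponent adaptively across training iterations, which this sketch omits, and the sample values are made up:

```python
import math

def flex_quantize(values, mantissa_bits=16):
    """Block floating point: one shared exponent for the whole tensor,
    per-element signed integer mantissas."""
    max_abs = max(abs(v) for v in values)
    # Choose the shared exponent so the largest magnitude fits the mantissa range.
    exp = math.ceil(math.log2(max_abs)) - (mantissa_bits - 1)
    scale = 2.0 ** exp
    lo, hi = -(2 ** (mantissa_bits - 1)), 2 ** (mantissa_bits - 1) - 1
    mantissas = [max(lo, min(hi, round(v / scale))) for v in values]
    return mantissas, exp

def flex_dequantize(mantissas, exp):
    scale = 2.0 ** exp
    return [m * scale for m in mantissas]

vals = [0.013, -1.7, 0.42, 3.14159]
m, e = flex_quantize(vals)
recon = flex_dequantize(m, e)
err = max(abs(a - b) for a, b in zip(vals, recon))
```

All elements share one exponent, so hardware can use cheap integer arithmetic on the mantissas; the price is that the round-trip error is bounded by the shared scale `2**e` rather than by each element's own magnitude.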
