Search Results for author: Evgeny Andriyash

Found 10 papers, 1 paper with code

A Path Towards Quantum Advantage in Training Deep Generative Models with Quantum Annealers

no code implementations · 4 Dec 2019 · Walter Vinci, Lorenzo Buffoni, Hossein Sadeghi, Amir Khoshaman, Evgeny Andriyash, Mohammad H. Amin

The hybrid structure of QVAE allows us to deploy current-generation quantum annealers in QCH generative models to achieve competitive performance on datasets such as MNIST.

PixelVAE++: Improved PixelVAE with Discrete Prior

no code implementations · 26 Aug 2019 · Hossein Sadeghi, Evgeny Andriyash, Walter Vinci, Lorenzo Buffoni, Mohammad H. Amin

Here we introduce PixelVAE++, a VAE with three types of latent variables and a PixelCNN++ for the decoder.

Ranked #22 on Image Generation on CIFAR-10 (bits/dimension metric)

Image Generation

Improved Gradient-Based Optimization Over Discrete Distributions

no code implementations · 29 Sep 2018 · Evgeny Andriyash, Arash Vahdat, Bill Macready

In many applications we seek to maximize an expectation with respect to a distribution over discrete variables.

Variational Inference
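
The optimization problem in the snippet above (maximizing an expectation over discrete variables) is classically attacked with the score-function (REINFORCE) estimator, which is the baseline such papers improve upon. The sketch below shows that baseline, not the paper's method; the toy objective `f` and all names are illustrative.

```python
# A minimal sketch of the score-function (REINFORCE) estimator -- the standard
# baseline that improved gradient estimators aim to beat. NOT the paper's
# method; the objective f and all names here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
TARGET = np.array([1.0, 0.0, 1.0])  # toy target pattern

def f(z):
    # Toy objective to maximize: negative squared distance to TARGET.
    return -np.sum((z - TARGET) ** 2)

def reinforce_grad(theta, n_samples=2000):
    """Estimate d/dtheta E_{z ~ Bernoulli(sigmoid(theta))}[f(z)]."""
    p = 1.0 / (1.0 + np.exp(-theta))
    z = (rng.random((n_samples, theta.size)) < p).astype(float)
    score = z - p                      # d log q(z)/d theta for a Bernoulli
    fz = np.apply_along_axis(f, 1, z)[:, None]
    return (fz * score).mean(axis=0)   # Monte Carlo average

theta = np.zeros(3)
for _ in range(200):
    theta += 0.5 * reinforce_grad(theta)   # gradient ascent on E[f]

p_final = 1.0 / (1.0 + np.exp(-theta))     # should approach TARGET
```

The estimator is unbiased but high-variance, which is exactly the weakness improved estimators target.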

Improved Gradient Estimators for Stochastic Discrete Variables

no code implementations · 27 Sep 2018 · Evgeny Andriyash, Arash Vahdat, Bill Macready

In many applications we seek to optimize an expectation with respect to a distribution over discrete variables.

Variational Inference

DVAE#: Discrete Variational Autoencoders with Relaxed Boltzmann Priors

no code implementations · NeurIPS 2018 · Arash Vahdat, Evgeny Andriyash, William G. Macready

Experiments on the MNIST and OMNIGLOT datasets show that these relaxations outperform previous discrete VAEs with Boltzmann priors.

Quantum Variational Autoencoder

no code implementations · 15 Feb 2018 · Amir Khoshaman, Walter Vinci, Brandon Denis, Evgeny Andriyash, Hossein Sadeghi, Mohammad H. Amin

We show that our model can be trained end-to-end by maximizing a well-defined loss function: a 'quantum' lower bound to a variational approximation of the log-likelihood.
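
The 'quantum' lower bound mentioned above generalizes the classical evidence lower bound (ELBO). As a point of reference, the sketch below computes that classical bound exactly for a toy model with a single binary latent; all numbers are illustrative, not from the paper.

```python
# A minimal sketch of the classical ELBO,
#   log p(x) >= E_q[log p(x|z)] - KL(q(z|x) || p(z)),
# which the 'quantum' lower bound in the snippet generalizes. Toy single
# binary latent z, enumerated exactly; all numbers are illustrative.
import numpy as np

def elbo(q_probs, prior_probs, log_lik):
    """q_probs[z] = q(z|x), prior_probs[z] = p(z), log_lik[z] = log p(x|z)."""
    recon = sum(q_probs[z] * log_lik[z] for z in (0, 1))
    kl = sum(q_probs[z] * np.log(q_probs[z] / prior_probs[z]) for z in (0, 1))
    return recon - kl

q = {0: 0.2, 1: 0.8}                   # variational posterior
prior = {0: 0.5, 1: 0.5}               # prior over the latent
ll = {0: np.log(0.1), 1: np.log(0.7)}  # log-likelihood of x given z

bound = elbo(q, prior, ll)
# Exact log-evidence for comparison: log sum_z p(z) p(x|z)
log_evidence = np.log(0.5 * 0.1 + 0.5 * 0.7)
assert bound <= log_evidence           # the bound always holds
```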

DVAE++: Discrete Variational Autoencoders with Overlapping Transformations

no code implementations · ICML 2018 · Arash Vahdat, William G. Macready, Zhengbing Bian, Amir Khoshaman, Evgeny Andriyash

Training of discrete latent variable models remains challenging because passing gradient information through discrete units is difficult.

Ranked #53 on Image Generation on CIFAR-10 (bits/dimension metric)

Image Generation
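
One common workaround for the difficulty described in the snippet above is the straight-through estimator: sample a hard binary unit on the forward pass, but backpropagate as if the sample were differentiable. The sketch below shows that generic trick, not the overlapping-transformations method of DVAE++; the toy loss and all names are illustrative.

```python
# A minimal sketch of the straight-through estimator, one common workaround
# for passing gradients through discrete units (NOT the overlapping-
# transformations method of DVAE++): sample a hard binary unit forward,
# but backpropagate as if the sample were the underlying probability.
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def step(logit):
    p = sigmoid(logit)
    z = float(rng.random() < p)        # hard, non-differentiable sample
    loss = (z - 1.0) ** 2              # toy loss: we want the unit to fire
    # Straight-through: treat dz/dp as 1, so
    # dloss/dlogit = dloss/dz * dp/dlogit.
    grad_logit = 2.0 * (z - 1.0) * p * (1.0 - p)
    return loss, grad_logit

logit = 0.0
for _ in range(500):
    _, g = step(logit)
    logit -= 1.0 * g                   # SGD; drives p(z = 1) upward

p_fire = sigmoid(logit)
```

The estimator is biased but cheap, which is why much of the literature, including the papers listed here, looks for better-behaved alternatives.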

Benchmarking Quantum Hardware for Training of Fully Visible Boltzmann Machines

no code implementations · 14 Nov 2016 · Dmytro Korenkevych, Yanbo Xue, Zhengbing Bian, Fabian Chudak, William G. Macready, Jason Rolfe, Evgeny Andriyash

We argue that this relates to the fact that we are training a quantum rather than a classical Boltzmann distribution in this case.

Benchmarking

Quantum Boltzmann Machine

no code implementations · 8 Jan 2016 · Mohammad H. Amin, Evgeny Andriyash, Jason Rolfe, Bohdan Kulchytskyy, Roger Melko

Inspired by the success of Boltzmann machines based on the classical Boltzmann distribution, we propose a new machine learning approach based on the quantum Boltzmann distribution of a transverse-field Ising Hamiltonian.

Quantum Physics
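
For reference, a transverse-field Ising Hamiltonian and the quantum Boltzmann distribution it induces take the standard form below; notation here is generic and may differ slightly from the paper's.

```latex
H = -\sum_a \Gamma_a \,\sigma_a^x \;-\; \sum_a b_a \,\sigma_a^z \;-\; \sum_{a<b} w_{ab}\, \sigma_a^z \sigma_b^z,
\qquad
\rho = \frac{e^{-H}}{\operatorname{Tr}\, e^{-H}}
```

Setting all transverse fields \(\Gamma_a = 0\) makes \(H\) diagonal and recovers the classical Boltzmann machine.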
