Search Results for author: Marc Vuffray

Found 14 papers, 7 papers with code

High-quality Thermal Gibbs Sampling with Quantum Annealing Hardware

no code implementations • 3 Sep 2021 • Jon Nelson, Marc Vuffray, Andrey Y. Lokhov, Tameem Albash, Carleton Coffrin

This work builds on those insights and identifies a class of small hardware-native Ising models that are robust to noise effects and proposes a procedure for executing these models on QA hardware to maximize Gibbs sampling performance.

Combinatorial Optimization

Single-Qubit Fidelity Assessment of Quantum Annealing Hardware

1 code implementation • 7 Apr 2021 • Jon Nelson, Marc Vuffray, Andrey Y. Lokhov, Carleton Coffrin

Overall, the proposed QASA protocol provides a useful tool for assessing the performance of current and emerging quantum annealing devices.

Exponential Reduction in Sample Complexity with Learning of Ising Model Dynamics

1 code implementation • 2 Apr 2021 • Arkopal Dutt, Andrey Y. Lokhov, Marc Vuffray, Sidhant Misra

We observe that for samples coming from a dynamical process far from equilibrium, the sample complexity reduces exponentially compared to a dynamical process that mixes quickly.
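Dynamical samples of the kind referred to above can be generated by, for example, Glauber dynamics, where one spin at a time is resampled from its conditional distribution. A minimal sketch (the couplings, fields, and system size below are illustrative assumptions, not values from the paper):

```python
import numpy as np

def glauber_step(spins, J, h, rng):
    """One asynchronous Glauber update: resample a random spin from its
    conditional distribution given the rest of the configuration."""
    i = rng.integers(len(spins))
    field = h[i] + J[i] @ spins            # local field on spin i (J has zero diagonal)
    p_up = 1.0 / (1.0 + np.exp(-2.0 * field))   # P(s_i = +1 | s_{-i})
    spins[i] = 1 if rng.random() < p_up else -1
    return spins

rng = np.random.default_rng(0)
p = 5
J = np.triu(rng.normal(scale=0.3, size=(p, p)), 1)
J = J + J.T                                # symmetric couplings, zero diagonal
h = rng.normal(scale=0.1, size=p)
spins = rng.choice([-1, 1], size=p)
trajectory = [spins.copy()]
for _ in range(100):
    trajectory.append(glauber_step(spins, J, h, rng).copy())
```

Consecutive configurations along such a trajectory are correlated; the paper's observation concerns how that correlation structure, far from equilibrium, can reduce the number of samples needed for learning.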

Learning Continuous Exponential Families Beyond Gaussian

1 code implementation • 18 Feb 2021 • Christopher X. Ren, Sidhant Misra, Marc Vuffray, Andrey Y. Lokhov

We address the problem of learning continuous exponential family distributions with unbounded support.

Programmable Quantum Annealers as Noisy Gibbs Samplers

no code implementations • 16 Dec 2020 • Marc Vuffray, Carleton Coffrin, Yaroslav A. Kharkov, Andrey Y. Lokhov

Drawing independent samples from high-dimensional probability distributions represents the major computational bottleneck for modern algorithms, including powerful machine learning frameworks such as deep learning.

Learning of Discrete Graphical Models with Neural Networks

1 code implementation • NeurIPS 2020 • Abhijith J., Andrey Y. Lokhov, Sidhant Misra, Marc Vuffray

In addition, we show a variant of NeurISE that can be used to learn a neural net representation for the full energy function of the true model.

Efficient Learning of Discrete Graphical Models

1 code implementation • NeurIPS 2020 • Marc Vuffray, Sidhant Misra, Andrey Y. Lokhov

We identify a single condition related to model parametrization that leads to rigorous guarantees on the recovery of model structure and parameters in any error norm, and is readily verifiable for a large class of models.

Online Learning of Power Transmission Dynamics

no code implementations • 27 Oct 2017 • Andrey Y. Lokhov, Marc Vuffray, Dmitry Shemetov, Deepjyoti Deka, Michael Chertkov

We consider the problem of reconstructing the dynamic state matrix of transmission power grids from time-stamped PMU measurements in the regime of ambient fluctuations.

Online Learning
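Reconstructing a dynamic state matrix from ambient fluctuations can be illustrated, in drastically simplified form, by ordinary least squares on a discrete-time linear system; the 3-state matrix and noise level below are invented for illustration and are not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
# Ground-truth stable "dynamic state matrix" (illustrative 3-state system)
A = np.array([[0.9, 0.1, 0.0],
              [0.0, 0.8, 0.2],
              [0.1, 0.0, 0.7]])
n, T = 3, 2000
X = np.zeros((T, n))
X[0] = rng.normal(size=n)
for t in range(T - 1):
    # Ambient fluctuations modelled as small Gaussian process noise
    X[t + 1] = A @ X[t] + 0.05 * rng.normal(size=n)

# Ordinary least squares: solve X[:-1] @ B = X[1:] for B = A^T
A_hat, *_ = np.linalg.lstsq(X[:-1], X[1:], rcond=None)
A_hat = A_hat.T
```

The point of the least-squares formulation is that the persistent ambient noise itself excites the system enough to identify A, without injecting probe signals.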

Information Theoretic Optimal Learning of Gaussian Graphical Models

no code implementations • 15 Mar 2017 • Sidhant Misra, Marc Vuffray, Andrey Y. Lokhov

What is the optimal number of independent observations from which a sparse Gaussian Graphical Model can be correctly recovered?

Graph Reconstruction
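For intuition about the recovery task whose sample-complexity limits this paper studies, a common baseline estimator (not the paper's information-theoretic analysis) inverts the empirical covariance and thresholds it; a minimal numpy sketch with an illustrative chain-structured precision matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
p, n = 4, 5000

# Illustrative sparse ground-truth precision matrix: a chain graph
Theta = np.eye(p)
for i in range(p - 1):
    Theta[i, i + 1] = Theta[i + 1, i] = 0.4
Sigma = np.linalg.inv(Theta)

# Draw n independent observations from the Gaussian graphical model
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

# Naive estimator: invert the empirical covariance, then threshold
Theta_hat = np.linalg.inv(np.cov(X, rowvar=False))
edges = np.abs(Theta_hat) > 0.2   # recovered graph structure
```

The paper asks how small n can be made while still recovering the edge set; the naive inversion above generally needs far more samples than the information-theoretic optimum.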

Optimal structure and parameter learning of Ising models

1 code implementation • 15 Dec 2016 • Andrey Y. Lokhov, Marc Vuffray, Sidhant Misra, Michael Chertkov

Reconstruction of structure and parameters of an Ising model from binary samples is a problem of practical importance in a variety of disciplines, ranging from statistical physics and computational biology to image processing and machine learning.

Graphical Models for Optimal Power Flow

no code implementations • 21 Jun 2016 • Krishnamurthy Dvijotham, Pascal Van Hentenryck, Michael Chertkov, Sidhant Misra, Marc Vuffray

In this paper, we formulate the optimal power flow problem over tree networks as an inference problem over a tree-structured graphical model where the nodal variables are low-dimensional vectors.

Interaction Screening: Efficient and Sample-Optimal Learning of Ising Models

no code implementations • NeurIPS 2016 • Marc Vuffray, Sidhant Misra, Andrey Y. Lokhov, Michael Chertkov

We prove that with appropriate regularization, the estimator recovers the underlying graph using a number of samples that is logarithmic in the system size p and exponential in the maximum coupling intensity and maximum node degree.
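The Interaction Screening estimator minimizes, node by node, the empirical average of the inverse exponential of that node's local energy, plus an l1 penalty on the couplings. A minimal sketch of the per-node objective, with an illustrative 3-spin model and scipy's generic optimizer standing in for a dedicated solver (parametrization and constants are my assumptions):

```python
import numpy as np
from scipy.optimize import minimize

def iso(theta, u, samples):
    """Empirical Interaction Screening Objective for node u.
    theta[0] is a local field, theta[1:] couplings to the other spins."""
    others = np.delete(samples, u, axis=1)
    energy = samples[:, u] * (others @ theta[1:] + theta[0])
    return np.mean(np.exp(-energy))

def fit_node(u, samples, lam=0.01):
    """l1-regularized ISO minimization for one node."""
    p = samples.shape[1]
    obj = lambda t: iso(t, u, samples) + lam * np.abs(t[1:]).sum()
    return minimize(obj, np.zeros(p), method="Powell").x

# Exact sampling from a tiny 3-spin Ising model (couplings illustrative)
rng = np.random.default_rng(2)
p = 3
J = np.zeros((p, p))
J[0, 1] = J[1, 0] = 0.8                 # single true edge between spins 0 and 1
states = np.array([[int(b) * 2 - 1 for b in f"{k:03b}"] for k in range(2 ** p)])
w = np.exp([s @ J @ s / 2 for s in states])
w /= w.sum()
samples = states[rng.choice(2 ** p, size=4000, p=w)]

theta0 = fit_node(0, samples)           # theta0[1] estimates J[0, 1]
```

Repeating `fit_node` for every node and symmetrizing the recovered couplings yields the full graph; the paper's guarantees concern how few samples this screening objective needs for that reconstruction.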
