Search Results for author: Rares-Darius Buhai

Found 4 papers, 1 paper with code

Beyond Parallel Pancakes: Quasi-Polynomial Time Guarantees for Non-Spherical Gaussian Mixtures

no code implementations 10 Dec 2021 Rares-Darius Buhai, David Steurer

The reason is that such outliers can simulate exponentially small mixing weights even for mixtures with polynomially lower-bounded mixing weights.

Learning Restricted Boltzmann Machines with Sparse Latent Variables

no code implementations NeurIPS 2020 Guy Bresler, Rares-Darius Buhai

In this paper, we give an algorithm for learning general RBMs with time complexity $\tilde{O}(n^{2^s+1})$, where $s$ is the maximum number of latent variables connected to the MRF neighborhood of an observed variable.
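The stated bound is polynomial in $n$ with degree $2^s + 1$, so the runtime exponent grows exponentially in the sparsity parameter $s$. A minimal sketch of this dependence (the function name and the small-$s$ values below are illustrative, not from the paper):

```python
# Illustrative sketch: the degree of the polynomial in the stated
# O~(n^(2^s + 1)) running time, as a function of s (the maximum number
# of latent variables connected to an observed variable's MRF neighborhood).

def rbm_runtime_exponent(s: int) -> int:
    """Degree of n in the O~(n^(2^s + 1)) bound."""
    return 2 ** s + 1

# The exponent roughly doubles with each additional latent neighbor:
for s in range(4):
    print(s, rbm_runtime_exponent(s))  # 0 -> 2, 1 -> 3, 2 -> 5, 3 -> 9
```

In particular, the algorithm is polynomial-time only when $s$ is constant (or $O(\log \log n)$-sized), which is why the result is phrased in terms of sparse latent variables.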

Benefits of Overparameterization in Single-Layer Latent Variable Generative Models

no code implementations 25 Sep 2019 Rares-Darius Buhai, Andrej Risteski, Yoni Halpern, David Sontag

One of the most surprising and exciting discoveries in supervised learning was the benefit of overparameterization (i.e. training a very large model) in improving the optimization landscape of a problem, with minimal effect on statistical performance (i.e. generalization).

Variational Inference

Empirical Study of the Benefits of Overparameterization in Learning Latent Variable Models

1 code implementation ICML 2020 Rares-Darius Buhai, Yoni Halpern, Yoon Kim, Andrej Risteski, David Sontag

One of the most surprising and exciting discoveries in supervised learning was the benefit of overparameterization (i.e. training a very large model) in improving the optimization landscape of a problem, with minimal effect on statistical performance (i.e. generalization).

Variational Inference
