Search Results for author: Michael Celentano

Found 8 papers, 0 papers with code

Exact and efficient phylodynamic simulation from arbitrarily large populations

no code implementations • 27 Feb 2024 • Michael Celentano, William S. DeWitt, Sebastian Prillo, Yun S. Song

Consequently, the computational cost is determined not by the size of the population tree, but by the size of the final simulated tree embedded within it.
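As a rough illustration of this scaling (a minimal sketch using the classical Kingman coalescent, not the paper's algorithm; the function kingman_tree and its interface are invented here), simulating the genealogy of n sampled lineages always takes exactly n - 1 coalescence events, so the population size N enters only through the waiting-time rates, never the running time:

    import numpy as np

    def kingman_tree(n, N, seed=0):
        # Kingman-coalescent genealogy of n sampled lineages from a
        # population of size N; the loop runs exactly n - 1 times, so the
        # cost is O(n) regardless of how large N is.
        rng = np.random.default_rng(seed)
        lineages = list(range(n))               # active lineage labels
        next_label, t, events = n, 0.0, []
        while len(lineages) > 1:
            k = len(lineages)
            rate = k * (k - 1) / (2.0 * N)      # total pairwise coalescence rate
            t += rng.exponential(1.0 / rate)    # waiting time to the next merger
            i, j = sorted(rng.choice(k, size=2, replace=False))
            a, b = lineages[i], lineages[j]
            del lineages[j], lineages[i]        # remove the two children
            lineages.append(next_label)         # add their parent
            events.append((t, a, b, next_label))
            next_label += 1
        return events                           # (time, child, child, parent)

    events = kingman_tree(n=10, N=10**9)        # 9 events, however large N is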

Mean-field variational inference with the TAP free energy: Geometric and statistical properties in linear models

no code implementations • 14 Nov 2023 • Michael Celentano, Zhou Fan, Licong Lin, Song Mei

In settings where it is conjectured that no efficient algorithm can find this local neighborhood, we prove analogous geometric properties for a local minimizer of the TAP free energy reachable by AMP, and show that posterior inference based on this minimizer remains correctly calibrated.

Variational Inference
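A minimal sketch of the AMP iteration referenced above, for a linear model y = A x + w with i.i.d. N(0, 1/n) design entries and a soft-thresholding denoiser (the fixed threshold and the function names soft and amp are illustrative choices; the paper works with posterior-mean denoisers calibrated by state evolution):

    import numpy as np

    def soft(u, thr):
        # soft-thresholding denoiser eta(u) = sign(u) * max(|u| - thr, 0)
        return np.sign(u) * np.maximum(np.abs(u) - thr, 0.0)

    def amp(y, A, thr=1.0, iters=30):
        n, p = A.shape
        x, z = np.zeros(p), y.copy()
        for _ in range(iters):
            r = x + A.T @ z                    # effective observation
            x = soft(r, thr)
            onsager = np.count_nonzero(x) / n  # (1/delta) * <eta'>, delta = n/p
            z = y - A @ x + onsager * z        # residual with Onsager correction
        return x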

Maximum Mean Discrepancy Meets Neural Networks: The Radon-Kolmogorov-Smirnov Test

no code implementations • 5 Sep 2023 • Seunghoon Paik, Michael Celentano, Alden Green, Ryan J. Tibshirani

Maximum mean discrepancy (MMD) refers to a general class of nonparametric two-sample tests that are based on maximizing the mean difference over samples from one distribution $P$ versus another $Q$, over all choices of data transformations $f$ living in some function space $\mathcal{F}$.
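For contrast with the neural-network witness class studied in the paper (where $\mathcal{F}$ is the unit ball of the Radon bounded-variation space), here is a minimal sketch of the standard kernel-MMD two-sample test with a Gaussian kernel and permutation calibration (the bandwidth, the number of permutations B, and the function names are all illustrative):

    import numpy as np

    def mmd2(X, Y, bw=1.0):
        # biased estimate of MMD^2 between samples X (n x d) and Y (m x d)
        # under a Gaussian kernel of bandwidth bw
        Z = np.vstack([X, Y])
        sq = np.sum(Z**2, axis=1)
        K = np.exp(-(sq[:, None] + sq[None, :] - 2 * Z @ Z.T) / (2 * bw**2))
        n = len(X)
        return K[:n, :n].mean() + K[n:, n:].mean() - 2 * K[:n, n:].mean()

    def perm_pvalue(X, Y, B=500, seed=0):
        # calibrate the MMD statistic by permuting the pooled sample
        rng = np.random.default_rng(seed)
        stat, Z, n = mmd2(X, Y), np.vstack([X, Y]), len(X)
        null = []
        for _ in range(B):
            idx = rng.permutation(len(Z))
            null.append(mmd2(Z[idx[:n]], Z[idx[n:]]))
        return (1 + sum(s >= stat for s in null)) / (B + 1)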

Sudakov-Fernique post-AMP, and a new proof of the local convexity of the TAP free energy

no code implementations • 19 Aug 2022 • Michael Celentano

As an example of its use, we provide a new, and arguably simpler, proof of some of the results of Celentano et al. (2021), which establish that the so-called TAP free energy in the $\mathbb{Z}_2$-synchronization problem is locally convex in the region to which AMP converges.
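For reference, the classical (unconditional) Sudakov-Fernique comparison inequality, which the paper extends to the conditional laws arising after AMP iterations, states that for centered Gaussian vectors $(X_i)_{i \le n}$ and $(Y_i)_{i \le n}$:

    \[
      \mathbb{E}\big[(X_i - X_j)^2\big] \le \mathbb{E}\big[(Y_i - Y_j)^2\big]
      \ \text{ for all } i, j
      \quad \Longrightarrow \quad
      \mathbb{E}\Big[\max_{i \le n} X_i\Big] \le \mathbb{E}\Big[\max_{i \le n} Y_i\Big].
    \]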

Local convexity of the TAP free energy and AMP convergence for Z2-synchronization

no code implementations • 21 Jun 2021 • Michael Celentano, Zhou Fan, Song Mei

This provides a rigorous foundation for variational inference in high dimensions via minimization of the TAP free energy.

Bayesian Inference · Variational Inference
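A minimal sketch of Bayes-AMP for $\mathbb{Z}_2$-synchronization under the spiked model $Y = (\lambda/n)\,xx^\top + W$ with GOE noise (the random initialization and the name amp_z2 are illustrative; analyses of this iteration use a spectral or state-evolution-tracked initialization):

    import numpy as np

    def amp_z2(Y, lam, iters=50, seed=0):
        n = Y.shape[0]
        rng = np.random.default_rng(seed)
        x = rng.standard_normal(n)          # illustrative (non-informative) init
        f_prev = np.zeros(n)
        for _ in range(iters):
            f = np.tanh(lam * x)            # posterior-mean denoiser for +/-1 spins
            b = lam * np.mean(1.0 - f**2)   # Onsager coefficient (1/n) sum_i f'(x_i)
            x = Y @ f - b * f_prev          # AMP update with memory correction
            f_prev = f
        return np.tanh(lam * x)             # estimate of the +/-1 spike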

Minimum complexity interpolation in random features models

no code implementations • 30 Mar 2021 • Michael Celentano, Theodor Misiakiewicz, Andrea Montanari

We study random features approximations to these norms and show that, for $p>1$, the number of random features required to approximate the original learning problem is upper bounded by a polynomial in the sample size.
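A minimal sketch of the simplest ($p = 2$) instance, where the minimum-norm interpolator over $N$ random ReLU features is given by the pseudoinverse (the data, dimensions, and choice of ReLU features are illustrative; the paper's interest is in general $p$ and in how large $N$ must be):

    import numpy as np

    rng = np.random.default_rng(0)
    n, d, N = 100, 5, 2000                        # samples, input dim, features

    X = rng.standard_normal((n, d))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

    W = rng.standard_normal((N, d))               # random first-layer weights, fixed
    Phi = np.maximum(X @ W.T, 0.0) / np.sqrt(N)   # ReLU random features, n x N

    a = np.linalg.pinv(Phi) @ y                   # min-l2-norm interpolator
    print(np.linalg.norm(Phi @ a - y))            # ~0: exact interpolation for N >= n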

The Lasso with general Gaussian designs with applications to hypothesis testing

no code implementations • 27 Jul 2020 • Michael Celentano, Andrea Montanari, Yuting Wei

On the other hand, the Lasso estimator can be precisely characterized in the regime in which both $n$ and $p$ are large and $n/p$ is of order one.

Two-sample testing
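The Lasso estimator characterized above can be computed with plain proximal gradient descent (ISTA); a minimal solver sketch for $\hat\beta = \arg\min_\beta \frac{1}{2n}\|y - X\beta\|_2^2 + \lambda\|\beta\|_1$ (the step-size rule and the name lasso_ista are illustrative, and none of this is specific to the paper's proofs):

    import numpy as np

    def lasso_ista(X, y, lam, iters=500):
        n, p = X.shape
        L = np.linalg.norm(X, 2) ** 2 / n   # Lipschitz constant of the gradient
        beta = np.zeros(p)
        for _ in range(iters):
            grad = X.T @ (X @ beta - y) / n                 # smooth-part gradient
            u = beta - grad / L                             # gradient step
            beta = np.sign(u) * np.maximum(np.abs(u) - lam / L, 0.0)  # soft-threshold prox
        return beta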

The estimation error of general first order methods

no code implementations • 28 Feb 2020 • Michael Celentano, Andrea Montanari, Yuchen Wu

These lower bounds are optimal in the sense that there exist algorithms whose estimation error matches the lower bounds up to asymptotically negligible terms.

Retrieval
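In this paper's framework, a general first-order method interleaves multiplications by the data matrix and its transpose with (possibly memory-dependent) coordinate-wise maps; plain gradient descent on least squares is its simplest instance. A minimal sketch for the linear model y = A x + w (the function name and step size are illustrative):

    import numpy as np

    def gfom_gradient_descent(A, y, step=0.1, iters=100):
        n, p = A.shape
        x = np.zeros(p)
        for _ in range(iters):
            # one multiplication by A and one by A.T per iteration,
            # composed with an affine (coordinate-wise) update
            x = x - step * (A.T @ (A @ x - y)) / n
        return x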
