Search Results for author: Yian Ma

Found 9 papers, 1 paper with code

Faster Sampling without Isoperimetry via Diffusion-based Monte Carlo

no code implementations 12 Jan 2024 Xunpeng Huang, Difan Zou, Hanze Dong, Yian Ma, Tong Zhang

Specifically, DMC follows the reverse SDE of a diffusion process that transforms the target distribution to the standard Gaussian, utilizing a non-parametric score estimation.

Tractable MCMC for Private Learning with Pure and Gaussian Differential Privacy

no code implementations 23 Oct 2023 Yingyu Lin, Yian Ma, Yu-Xiang Wang, Rachel Redberg

Posterior sampling, i.e., using the exponential mechanism to sample from the posterior distribution, provides $\varepsilon$-pure differential privacy (DP) guarantees and does not suffer from the potentially unbounded privacy breach introduced by $(\varepsilon,\delta)$-approximate DP.

Linear Convergence of Black-Box Variational Inference: Should We Stick the Landing?

no code implementations 27 Jul 2023 Kyurae Kim, Yian Ma, Jacob R. Gardner

We prove that black-box variational inference (BBVI) with control variates, particularly the sticking-the-landing (STL) estimator, converges at a geometric (traditionally called "linear") rate under perfect variational family specification.

Variational Inference

Disentangled Multi-Fidelity Deep Bayesian Active Learning

1 code implementation 7 May 2023 Dongxia Wu, Ruijia Niu, Matteo Chinazzi, Yian Ma, Rose Yu

To balance quality and cost, various domain areas of science and engineering run simulations at multiple levels of sophistication.

Active Learning Gaussian Processes

On Optimal Early Stopping: Overparametrization versus Underparametrization

no code implementations 29 Sep 2021 Ruoqi Shen, Liyao Gao, Yian Ma

Early stopping is a simple and widely used method to prevent over-training neural networks.

Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence

no code implementations 30 Jun 2021 Ghassen Jerfel, Serena Wang, Clara Fannjiang, Katherine A. Heller, Yian Ma, Michael I. Jordan

We thus propose a novel combination of optimization and sampling techniques for approximate Bayesian inference by constructing an IS proposal distribution through the minimization of a forward KL (FKL) divergence.

Bayesian Inference Variational Inference

Langevin Dynamics as Nonparametric Variational Inference

no code implementations Approximate Inference (AABI) Symposium 2019 Matthew D. Hoffman, Yian Ma

Variational inference (VI) and Markov chain Monte Carlo (MCMC) are approximate posterior inference algorithms that are often said to have complementary strengths, with VI being fast but biased and MCMC being slower but asymptotically unbiased.

Variational Inference
