Search Results for author: Yian Ma

Found 10 papers, 1 paper with code

ClimaQA: An Automated Evaluation Framework for Climate Foundation Models

no code implementations22 Oct 2024 Veeramakali Vignesh Manivannan, Yasaman Jafari, Srikar Eranky, Spencer Ho, Rose Yu, Duncan Watson-Parris, Yian Ma, Leon Bergen, Taylor Berg-Kirkpatrick

However, a critical issue remains: the lack of a comprehensive evaluation framework capable of assessing the quality and scientific validity of model outputs.

Accuracy on the wrong line: On the pitfalls of noisy data for out-of-distribution generalisation

no code implementations27 Jun 2024 Amartya Sanyal, Yaxi Hu, Yaodong Yu, Yian Ma, Yixin Wang, Bernhard Schölkopf

"Accuracy-on-the-line" is a widely observed phenomenon in machine learning, where a model's accuracy on in-distribution (ID) and out-of-distribution (OOD) data is positively correlated across different hyperparameters and data configurations.

Faster Sampling without Isoperimetry via Diffusion-based Monte Carlo

no code implementations12 Jan 2024 Xunpeng Huang, Difan Zou, Hanze Dong, Yian Ma, Tong Zhang

Specifically, DMC follows the reverse SDE of a diffusion process that transforms the target distribution to the standard Gaussian, utilizing non-parametric score estimation.
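
The toy sketch below illustrates the reverse-SDE mechanics on a 1-D two-mode Gaussian mixture whose time-dependent score is available in closed form; it is an assumption-laden stand-in, not the paper's DMC algorithm, which estimates the score non-parametrically.

```python
# Reverse-SDE sampling sketch on a toy 1-D Gaussian-mixture target,
# using the analytic score of the noised marginals (closed form here,
# estimated non-parametrically in DMC itself).
import numpy as np

rng = np.random.default_rng(0)
m, sigma = 2.0, 0.5          # target: 0.5*N(-m, sigma^2) + 0.5*N(+m, sigma^2)

def score_t(x, t):
    """Score of the noised marginal p_t under the forward OU SDE dx = -x dt + sqrt(2) dW."""
    means = np.array([-m, m]) * np.exp(-t)
    var = sigma**2 * np.exp(-2 * t) + (1.0 - np.exp(-2 * t))
    diffs = x[:, None] - means[None, :]              # (n_samples, 2 components)
    logits = -0.5 * diffs**2 / var
    resp = np.exp(logits - logits.max(axis=1, keepdims=True))
    resp /= resp.sum(axis=1, keepdims=True)          # component responsibilities
    return (-(diffs / var) * resp).sum(axis=1)       # mixture score at each sample

T, n_steps, n_samples = 5.0, 500, 2000
dt = T / n_steps

x = rng.standard_normal(n_samples)                   # start near p_T ~ N(0, 1)

# Euler-Maruyama on the reverse-time SDE, integrating t from T down to 0:
#   dx = [x + 2 * score_t(x, t)] ds + sqrt(2) dW,  with s = T - t
for k in range(n_steps):
    t = T - k * dt
    drift = x + 2.0 * score_t(x, t)
    x = x + dt * drift + np.sqrt(2.0 * dt) * rng.standard_normal(n_samples)

print("mean |x| (target has modes at +/- 2):", np.abs(x).mean())
```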

Linear Convergence of Black-Box Variational Inference: Should We Stick the Landing?

no code implementations27 Jul 2023 Kyurae Kim, Yian Ma, Jacob R. Gardner

We prove that black-box variational inference (BBVI) with control variates, particularly the sticking-the-landing (STL) estimator, converges at a geometric (traditionally called "linear") rate under perfect variational family specification.

Variational Inference
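
Below is a minimal PyTorch sketch of the sticking-the-landing (STL) gradient estimator for reparameterized BBVI with a diagonal Gaussian variational family; the toy target and hyperparameters are illustrative assumptions, not taken from the paper.

```python
# Sticking-the-landing (STL) estimator: drop the score term of q by
# detaching the variational parameters inside log q, keeping only the
# path derivative through the reparameterized sample z.
import torch

def log_target(z):
    # Toy unnormalized target: standard Gaussian shifted to mean 3
    return -0.5 * ((z - 3.0) ** 2).sum(-1)

# Variational parameters: mean and log-std of a diagonal Gaussian q
mu = torch.zeros(2, requires_grad=True)
log_std = torch.zeros(2, requires_grad=True)
opt = torch.optim.Adam([mu, log_std], lr=0.05)

for step in range(2000):
    opt.zero_grad()
    eps = torch.randn(64, 2)                     # reparameterization noise
    z = mu + log_std.exp() * eps                 # z depends on (mu, log_std)

    # STL: evaluate log q at z with the variational parameters detached,
    # so only the path derivative through z remains.
    q_detached = torch.distributions.Normal(mu.detach(), log_std.detach().exp())
    elbo = (log_target(z) - q_detached.log_prob(z).sum(-1)).mean()
    (-elbo).backward()
    opt.step()

print("fitted mean:", mu.detach(), "fitted std:", log_std.exp().detach())
```

Using the live parameters instead of the detached copies inside `log_prob` recovers the standard reparameterized estimator, whose gradient noise does not vanish even when q matches the target.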

Disentangled Multi-Fidelity Deep Bayesian Active Learning

1 code implementation7 May 2023 Dongxia Wu, Ruijia Niu, Matteo Chinazzi, Yian Ma, Rose Yu

To balance quality and cost, various domain areas of science and engineering run simulations at multiple levels of sophistication.

Active Learning · Gaussian Processes
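
As a generic illustration of trading accuracy against simulation cost (not the paper's disentangled deep Bayesian method), the sketch below chooses which of two toy simulators to query by predictive-variance-per-unit-cost, with an independent Gaussian process per fidelity; the simulators and costs are assumptions.

```python
# Cost-aware, variance-based acquisition over two simulation fidelities,
# using one Gaussian process per fidelity (illustrative heuristic only).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)

def simulate(x, fidelity):
    """Toy simulators: the low-fidelity run is a cheap, biased approximation."""
    true = np.sin(3 * x)
    return true + (0.3 * x if fidelity == "low" else 0.0)

costs = {"low": 1.0, "high": 10.0}
data = {f: ([], []) for f in costs}              # per-fidelity (X, y) pools
candidates = np.linspace(0, 2, 50)

# Seed each fidelity with a few random runs
for f in costs:
    for x in rng.uniform(0, 2, size=3):
        data[f][0].append([x]); data[f][1].append(simulate(x, f))

for _ in range(10):                              # active-learning loop
    best = None
    for f in costs:
        gp = GaussianProcessRegressor().fit(np.array(data[f][0]), np.array(data[f][1]))
        _, std = gp.predict(candidates[:, None], return_std=True)
        i = int(np.argmax(std))
        score = std[i] / costs[f]                # uncertainty per unit cost
        if best is None or score > best[0]:
            best = (score, f, candidates[i])
    _, f, x = best
    data[f][0].append([x]); data[f][1].append(simulate(x, f))
    print(f"queried fidelity={f} at x={x:.2f}")
```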

On Optimal Early Stopping: Overparametrization versus Underparametrization

no code implementations29 Sep 2021 Ruoqi Shen, Liyao Gao, Yian Ma

Early stopping is a simple and widely used method to prevent over-training neural networks.
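
A minimal, generic early-stopping loop is sketched below; the `train_epoch` and `eval_loss` callables are hypothetical placeholders, and the patience-based rule is the common pattern rather than anything specific to this paper's analysis.

```python
# Generic early stopping: halt training once validation loss has not
# improved for `patience` consecutive epochs, and return the best model.
import copy

def train_with_early_stopping(model, train_epoch, eval_loss, max_epochs=200, patience=10):
    """`train_epoch(model)` runs one epoch in place; `eval_loss(model)` returns
    the validation loss. Both callables are placeholders for illustration."""
    best_loss, best_state, epochs_without_improvement = float("inf"), None, 0
    for epoch in range(max_epochs):
        train_epoch(model)
        loss = eval_loss(model)
        if loss < best_loss:
            best_loss, best_state = loss, copy.deepcopy(model)
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                break                            # validation loss stopped improving
    return best_state if best_state is not None else model
```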

Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence

no code implementations30 Jun 2021 Ghassen Jerfel, Serena Wang, Clara Fannjiang, Katherine A. Heller, Yian Ma, Michael I. Jordan

We thus propose a novel combination of optimization and sampling techniques for approximate Bayesian inference by constructing an IS proposal distribution through the minimization of a forward KL (FKL) divergence.

Bayesian Inference · Variational Inference
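
The sketch below shows one way an FKL-based proposal could be fitted in practice: the gradient of E_p[log q_phi] is estimated with self-normalized importance weights drawn from the current proposal. The toy target is an assumption, and the loop is a simplified stand-in rather than the paper's exact procedure.

```python
# Fitting an importance-sampling proposal by minimizing a forward KL,
# KL(p || q_phi), with self-normalized importance weights from q itself.
import torch

def log_target(z):                                  # unnormalized log p(z)
    return -0.5 * ((z - 2.0) ** 2 / 0.25).sum(-1)   # toy target ~ N(2, 0.5^2)

mu = torch.zeros(1, requires_grad=True)
log_std = torch.zeros(1, requires_grad=True)
opt = torch.optim.Adam([mu, log_std], lr=0.05)

for step in range(3000):
    opt.zero_grad()
    q = torch.distributions.Normal(mu, log_std.exp())
    z = q.sample((256,))                            # draw from current proposal
    with torch.no_grad():
        log_w = log_target(z) - q.log_prob(z).sum(-1)
        w = torch.softmax(log_w, dim=0)             # self-normalized IS weights
    # Forward-KL surrogate: maximize the weighted log-density of q at z
    loss = -(w * q.log_prob(z).sum(-1)).sum()
    loss.backward()
    opt.step()

print("proposal mean:", mu.item(), "proposal std:", log_std.exp().item())
```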

Langevin Dynamics as Nonparametric Variational Inference

no code implementations Symposium on Advances in Approximate Bayesian Inference (AABI) 2019 Matthew D. Hoffman, Yian Ma

Variational inference (VI) and Markov chain Monte Carlo (MCMC) are approximate posterior inference algorithms that are often said to have complementary strengths, with VI being fast but biased and MCMC being slower but asymptotically unbiased.

Variational Inference
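
For the MCMC side of this comparison, the sketch below runs unadjusted Langevin dynamics (ULA) on a toy 2-D Gaussian target; the target, step size, and number of chains are illustrative assumptions.

```python
# Unadjusted Langevin dynamics (ULA) on a toy 2-D Gaussian target:
# each update is a gradient step on log p plus scaled Gaussian noise.
import numpy as np

rng = np.random.default_rng(0)
precision = np.array([[2.0, 0.5], [0.5, 1.0]])      # target: N(0, precision^{-1})

def grad_log_target(x):
    return -x @ precision.T                         # gradient of log N(0, Sigma)

step, n_steps, n_chains = 0.05, 5000, 100
x = rng.standard_normal((n_chains, 2))

for _ in range(n_steps):
    # Langevin update: drift along the score plus noise scaled by sqrt(2*step)
    x = x + step * grad_log_target(x) + np.sqrt(2 * step) * rng.standard_normal(x.shape)

print("empirical covariance:\n", np.cov(x.T))
print("target covariance:\n", np.linalg.inv(precision))
```

Because ULA omits a Metropolis correction, its stationary distribution carries a discretization bias on the order of the step size, which shrinks as the step size decreases.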
