no code implementations • 27 Feb 2024 • Ye He, Kevin Rojas, Molei Tao
It first describes a framework, Diffusion Monte Carlo (DMC), based on the simulation of a denoising diffusion process with its score function approximated by a generic Monte Carlo estimator.
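The DMC idea can be illustrated with a minimal 1-D sketch (an illustrative toy, not the paper's implementation): an Ornstein–Uhlenbeck forward process is reversed by Euler–Maruyama, with the score of the noised density estimated at each step by self-normalized importance sampling over the denoising posterior. The bimodal mixture target, proposal, candidate count, and step sizes below are all assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p0(y):
    """Unnormalized log-density of a bimodal Gaussian-mixture target."""
    return np.logaddexp(-(y + 2.0) ** 2 / (2 * 0.25),
                        -(y - 2.0) ** 2 / (2 * 0.25))

def mc_score(x, t, n_cand=256):
    """Monte Carlo estimate of the score of the OU-noised density p_t.

    Self-normalized importance sampling over the denoising posterior
    p(x_0 | x_t); with proposal N(x/a, (sigma/a)^2) the likelihood and
    proposal densities cancel up to a constant, so the weights reduce
    to the target density evaluated at the candidates.
    """
    a = np.exp(-t)                   # OU contraction factor
    sigma2 = 1.0 - np.exp(-2 * t)    # forward noise variance
    y = x[:, None] / a + (np.sqrt(sigma2) / a) * rng.standard_normal((x.size, n_cand))
    logw = log_p0(y)
    w = np.exp(logw - logw.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    # posterior mean of (a*y - x)/sigma^2 is the score estimate
    return (w * (a * y - x[:, None])).sum(axis=1) / sigma2

# reverse-time Euler--Maruyama from the near-stationary N(0,1) start
T, h, n = 3.0, 0.05, 400
x = rng.standard_normal(n)
t = T
while t > h:
    x = x + h * (x + 2.0 * mc_score(x, t)) + np.sqrt(2 * h) * rng.standard_normal(n)
    t -= h
```

The final particles approximate the bimodal target: the Monte Carlo score replaces the learned score network of standard denoising diffusion samplers.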
no code implementations • 3 Apr 2023 • Krishnakumar Balasubramanian, Promit Ghosal, Ye He
We derive high-dimensional scaling limits and fluctuations for the online least-squares Stochastic Gradient Descent (SGD) algorithm, explicitly taking the properties of the data-generating model into account.
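As a toy instance of the setting (streaming data from a linear model with Gaussian covariates), a one-pass online least-squares SGD loop might look like the following; the dimension, step size, noise level, and ground-truth parameter are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_steps, eta = 50, 20_000, 0.01
theta_star = rng.standard_normal(d) / np.sqrt(d)  # ground-truth parameter
theta = np.zeros(d)

for _ in range(n_steps):
    a = rng.standard_normal(d)                         # fresh covariate
    b = a @ theta_star + 0.1 * rng.standard_normal()   # noisy response
    # one-pass SGD step on the instantaneous squared loss
    theta -= eta * (a @ theta - b) * a

err = np.linalg.norm(theta - theta_star)
```

Each sample is seen exactly once, which is the online regime whose high-dimensional (d growing with the iteration count) behavior the paper characterizes.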
no code implementations • 7 Mar 2023 • Alireza Mousavi-Hosseini, Tyler Farghly, Ye He, Krishnakumar Balasubramanian, Murat A. Erdogdu
We do so by establishing upper and lower bounds for Langevin diffusions and LMC under weak Poincaré inequalities that are satisfied by a large class of densities, including polynomially decaying heavy-tailed densities (i.e., Cauchy-type).
no code implementations • 1 Mar 2023 • Ye He, Tyler Farghly, Krishnakumar Balasubramanian, Murat A. Erdogdu
We analyze the complexity of sampling from a class of heavy-tailed distributions by discretizing a natural class of Itô diffusions associated with weighted Poincaré inequalities.
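As a toy illustration of the idea (this specific diffusion is a textbook-style example chosen for this sketch, not necessarily the class analyzed in the paper): with diffusion coefficient a(x) = 1 + x², the drift a(x)(log π)'(x) + a'(x) = -4x + 2x = -2x makes π(x) ∝ (1 + x²)^{-2}, a polynomially decaying heavy-tailed density, stationary for dX = -2X dt + √(2(1+X²)) dW; an Euler–Maruyama discretization then samples from it approximately.

```python
import numpy as np

rng = np.random.default_rng(5)

# Euler--Maruyama for dX = -2 X dt + sqrt(2 (1 + X^2)) dW,
# whose stationary density is pi(x) ∝ (1 + x^2)^{-2}.
h, n_steps, burn = 0.01, 100_000, 1_000
x, xs = 0.0, []
for k in range(n_steps):
    x = x - 2.0 * x * h + np.sqrt(2.0 * (1.0 + x * x) * h) * rng.standard_normal()
    if k >= burn:
        xs.append(x)
xs = np.array(xs)
```

The state-dependent diffusion coefficient grows with |x|, which is what lets a diffusion with a simple drift reach polynomially heavy tails.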
no code implementations • 15 Nov 2022 • Ye He, Krishnakumar Balasubramanian, Bharath K. Sriperumbudur, Jianfeng Lu
In this work, we propose the Regularized Stein Variational Gradient Flow which interpolates between the Stein Variational Gradient Flow and the Wasserstein Gradient Flow.
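For orientation, the standard particle discretization of the (unregularized) Stein Variational Gradient Flow is the SVGD update of Liu and Wang; the sketch below is that baseline, not the regularized variant proposed in the paper, and the RBF kernel, fixed bandwidth, step size, and Gaussian target are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
n, h, step, iters = 100, 1.0, 0.2, 1_000
x = rng.normal(0.0, 2.0, n)           # particle positions

def grad_log_p(x):
    return -x                          # score of a standard normal target

for _ in range(iters):
    diff = x[:, None] - x[None, :]             # diff[i, j] = x_i - x_j
    k = np.exp(-diff ** 2 / (2 * h ** 2))      # RBF kernel matrix
    grad_k = k * diff / h ** 2                 # gradient of k(x_j, .) at x_i
    # SVGD direction: kernel-averaged score plus a repulsive term
    phi = (k * grad_log_p(x)[None, :] + grad_k).mean(axis=1)
    x = x + step * phi
```

The repulsive term `grad_k` keeps the particles spread out; the regularization studied in the paper modifies how this deterministic flow relates to the Wasserstein gradient flow.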
no code implementations • 20 Jan 2022 • Ye He, Krishnakumar Balasubramanian, Murat A. Erdogdu
We analyze the oracle complexity of sampling from polynomially decaying heavy-tailed target densities based on running the Unadjusted Langevin Algorithm on certain transformed versions of the target density.
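A minimal 1-D sketch of the transformation idea (an illustrative instance, not the paper's exact construction): for the heavy-tailed target π(y) ∝ (1 + y²)^{-2}, the change of variables y = sinh(x) pulls π back to a light-tailed density ∝ cosh(x)^{-3}, so the Unadjusted Langevin Algorithm is run on the potential U(x) = 3 log cosh(x) and the chain is mapped through sinh. The step size and chain length are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def grad_U(x):
    # U(x) = 3 log cosh(x): the light-tailed transformed potential
    return 3.0 * np.tanh(x)

eta, n_steps, burn = 0.05, 100_000, 1_000
x, xs = 0.0, []
for k in range(n_steps):
    # Unadjusted Langevin step on the transformed density
    x = x - eta * grad_U(x) + np.sqrt(2 * eta) * rng.standard_normal()
    if k >= burn:
        xs.append(x)
ys = np.sinh(np.array(xs))   # map back to the heavy-tailed target
```

ULA applied directly to π would suffer from the vanishing gradient of log π in the tails; the transformation makes the drift well-behaved, which is the source of the improved oracle complexity.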
no code implementations • NeurIPS 2020 • Ye He, Krishnakumar Balasubramanian, Murat A. Erdogdu
The randomized midpoint method, proposed by [SL19], has emerged as an optimal discretization procedure for simulating continuous-time Langevin diffusions.
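A sketch of the randomized midpoint idea for the overdamped Langevin diffusion with a standard Gaussian target ([SL19] state the scheme for the underdamped dynamics; the target, step size, and chain count here are illustrative): the drift is evaluated at a uniformly random interior point of each step, with the Brownian increment shared between the two stages.

```python
import numpy as np

rng = np.random.default_rng(4)

def grad_U(x):
    return x   # potential U(x) = x^2 / 2, so the target is N(0, 1)

h, n_steps, n_chains = 0.1, 2_000, 2_000
x = np.zeros(n_chains)
for _ in range(n_steps):
    alpha = rng.uniform(size=n_chains)            # random interior time
    xi1 = np.sqrt(alpha * h) * rng.standard_normal(n_chains)        # W(alpha*h)
    xi2 = np.sqrt((1 - alpha) * h) * rng.standard_normal(n_chains)  # W(h) - W(alpha*h)
    # evaluate the drift at the randomly chosen midpoint of the step,
    # reusing the first Brownian increment in both stages
    x_mid = x - alpha * h * grad_U(x) + np.sqrt(2.0) * xi1
    x = x - h * grad_U(x_mid) + np.sqrt(2.0) * (xi1 + xi2)
```

Averaging over the uniform evaluation point makes the drift term an unbiased estimate of the integrated drift over the step, which is what drives the improved discretization error compared with Euler–Maruyama.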
no code implementations • 21 Oct 2020 • Ye He, Chao Zhu, Xu-Cheng Yin
These two branches are trained in a mutually supervised manner with full-body annotations and visible-body annotations, respectively.