1 code implementation • 9 Feb 2024 • Yuhao Wang, Ming Gao, Wai Ming Tai, Bryon Aragam, Arnab Bhattacharyya
We develop optimal algorithms for learning undirected Gaussian trees and directed Gaussian polytrees from data.
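The paper's own optimal algorithms are not reproduced here; as an illustrative baseline, the classical Chow-Liu procedure learns a maximum-likelihood tree by computing a maximum-weight spanning tree under pairwise mutual information, which for Gaussians depends only on correlations. The function name and the toy chain below are this sketch's own choices, not the paper's.

```python
import numpy as np

def chow_liu_gaussian_tree(X):
    """Chow-Liu sketch: the maximum-likelihood tree is the maximum-weight
    spanning tree under pairwise mutual information, which for Gaussians
    is I(i, j) = -0.5 * log(1 - rho_ij^2)."""
    d = X.shape[1]
    rho = np.corrcoef(X, rowvar=False)
    mi = -0.5 * np.log1p(-np.clip(rho ** 2, 0.0, 1.0 - 1e-12))
    # Prim's algorithm on the complete graph weighted by mutual information.
    in_tree, edges = {0}, []
    while len(in_tree) < d:
        _, i, j = max(
            (mi[i, j], i, j)
            for i in in_tree for j in range(d) if j not in in_tree
        )
        edges.append((i, j))
        in_tree.add(j)
    return edges

# Toy chain X0 -> X1 -> X2: the recovered skeleton should be {0-1, 1-2}.
rng = np.random.default_rng(0)
n = 5000
x0 = rng.normal(size=n)
x1 = x0 + 0.5 * rng.normal(size=n)
x2 = x1 + 0.5 * rng.normal(size=n)
edges = chow_liu_gaussian_tree(np.column_stack([x0, x1, x2]))
```

Chow-Liu recovers the undirected skeleton only; orienting edges into a polytree requires additional steps beyond this sketch.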
no code implementations • 28 Dec 2023 • Zhao Lyu, Wai Ming Tai, Mladen Kolar, Bryon Aragam
In this paper, we highlight the inherent limitations of cross-validation when employed to discern the structure of a Gaussian graphical model.
no code implementations • 8 Nov 2023 • Benwei Shi, Aditya Bhaskara, Wai Ming Tai, Jeff M. Phillips
We show that a constant-size constant-error coreset for polytope distance is simple to maintain under merges of coresets.
no code implementations • 6 May 2023 • Wai Ming Tai, Bryon Aragam
We study the problem of learning mixtures of Gaussians with censored data.
no code implementations • 28 Mar 2022 • Bryon Aragam, Wai Ming Tai
Combining these bounds, we conclude that the optimal sample complexity of this problem lies strictly between polynomial and exponential, an uncommon phenomenon in learning theory.
1 code implementation • 25 Jan 2022 • Ming Gao, Wai Ming Tai, Bryon Aragam
In other words, at least for Gaussian models with equal error variances, learning a directed graphical model is statistically no more difficult than learning an undirected graphical model.
no code implementations • 15 Jul 2020 • Wai Ming Tai
We study how to construct a small subset $Q$ of $P$ such that the kernel density estimate of $P$ is approximated by the kernel density estimate of $Q$.
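A minimal sketch of the problem setup: a uniform random subsample $Q$ is the standard baseline coreset, giving uniform error $\varepsilon$ with $O(1/\varepsilon^2)$ points; the constructions studied in this line of work aim for far smaller $Q$. The helper name and parameters below are illustrative, not from the paper.

```python
import numpy as np

def kde(points, queries, bandwidth=1.0):
    """Gaussian kernel density estimate of `points`, evaluated at `queries`."""
    d2 = ((queries[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * bandwidth ** 2)).mean(axis=1)

rng = np.random.default_rng(0)
P = rng.normal(size=(20000, 2))
# Baseline coreset: a uniform random subsample of P.
Q = P[rng.choice(len(P), size=500, replace=False)]
queries = rng.normal(size=(100, 2))
# Uniform error of the subsample KDE against the full KDE at the queries.
err = np.abs(kde(P, queries) - kde(Q, queries)).max()
```

With 500 of 20000 points, the worst-case deviation over these queries is already small; the point of coreset constructions is to beat this random-sampling size.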
no code implementations • 28 May 2019 • Aditya Bhaskara, Wai Ming Tai
The problem is formalized as factorizing a $d \times n$ matrix $X$ (whose columns are the signals) as $X = AY$, where $A$ has a prescribed number $m$ of columns (typically $m \ll n$), and the columns of $Y$ are $k$-sparse (typically $k \ll d$).
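To make the factorization concrete, here is a small sketch of the inner sparse-recovery step: given a known dictionary $A$, orthogonal matching pursuit recovers a $k$-sparse column of $Y$ from the corresponding column of $X$. OMP is a classical baseline, not the paper's algorithm, and the dictionary sizes below are toy choices.

```python
import numpy as np

def omp(A, x, k):
    """Orthogonal matching pursuit: greedily select k dictionary columns,
    refitting by least squares on the chosen support each round."""
    support = []
    resid = x.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ resid)))
        if j in support:  # residual already (numerically) zero
            break
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], x, rcond=None)
        resid = x - A[:, support] @ coef
    y = np.zeros(A.shape[1])
    y[support] = coef
    return y

# Toy instance: random unit-norm dictionary, one 3-sparse coefficient vector.
rng = np.random.default_rng(1)
d, m, k = 256, 100, 3
A = rng.normal(size=(d, m))
A /= np.linalg.norm(A, axis=0)
y_true = np.zeros(m)
supp = rng.choice(m, size=k, replace=False)
y_true[supp] = rng.choice([-1.0, 1.0], size=k) * (1.0 + rng.random(k))
x = A @ y_true
y_hat = omp(A, x, k)
```

The harder question studied in dictionary learning is recovering $A$ itself from the columns of $X$; this sketch only covers the sparse-coding step with $A$ given.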
no code implementations • 9 Nov 2018 • Jeff M. Phillips, Wai Ming Tai
We introduce two versions of a new sketch for approximately embedding the Gaussian kernel into Euclidean inner product space.
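The paper's two sketch constructions are not reproduced here; the classical point of comparison for such embeddings is random Fourier features (Rahimi and Recht), where an explicit feature map $z$ satisfies $z(x)^\top z(y) \approx \exp(-\|x-y\|^2 / (2\sigma^2))$. The function name and sizes below are this sketch's own choices.

```python
import numpy as np

def gaussian_rff(X, D, bandwidth=1.0, seed=0):
    """Random Fourier features: E[z(x) @ z(y)] = exp(-||x - y||^2 / (2 bw^2))."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=1.0 / bandwidth, size=(X.shape[1], D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
Z = gaussian_rff(X, D=4000)  # embed into a 4000-dimensional inner product space
K_true = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(-1) / 2.0)
# Worst-case entrywise error of the approximate Gram matrix.
err = np.abs(Z @ Z.T - K_true).max()
```

The error decays like $O(1/\sqrt{D})$ per entry, which is the bound improved sketches aim to beat in feature count.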
no code implementations • 6 Feb 2018 • Jeff M. Phillips, Wai Ming Tai
When $d\geq 1/\varepsilon^2$, it is known that a coreset of size $O(1/\varepsilon^2)$ suffices.
no code implementations • 11 Oct 2017 • Jeff M. Phillips, Wai Ming Tai
When the dimension $d$ is constant, we demonstrate much tighter bounds on the size of the coreset specifically for Gaussian kernels, showing that it is bounded by the size of the coreset for axis-aligned rectangles.