no code implementations • 17 Mar 2022 • Tyler Maunu, Chenyu Yu, Gilad Lerman
Our results emphasize the advantages of the nonconvex methods over a competing convex approach to solving this problem in the differentially private setting.
1 code implementation • 13 Jan 2022 • Yunpeng Shi, Shaohan Li, Tyler Maunu, Gilad Lerman
We develop new statistics for robustly filtering corrupted keypoint matches in the structure from motion pipeline.
no code implementations • NeurIPS 2021 • Max Daniels, Tyler Maunu, Paul Hand
We consider the fundamental problem of sampling the optimal transport coupling between given source and target distributions.
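The paper targets large-scale, continuous distributions; purely as a toy point of reference, the sketch below computes the entropy-regularized optimal transport coupling between two small discrete distributions with Sinkhorn iterations and then samples index pairs from that coupling. The support points, weights, and regularization level are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy discrete source and target supports with uniform weights.
x = rng.normal(size=(6, 2))           # source support points
y = rng.normal(loc=1.0, size=(8, 2))  # target support points
a = np.full(6, 1 / 6)                 # source weights
b = np.full(8, 1 / 8)                 # target weights

# Squared-Euclidean cost (rescaled) and Gibbs kernel for regularization eps.
C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
C = C / C.max()
eps = 0.05
K = np.exp(-C / eps)

# Sinkhorn iterations give the entropic OT coupling P = diag(u) K diag(v).
u = np.ones(6)
for _ in range(500):
    v = b / (K.T @ u)
    u = a / (K @ v)
P = u[:, None] * K * v[None, :]

# "Sample the coupling": draw index pairs (i, j) with probability P[i, j].
flat = rng.choice(P.size, size=5, p=P.ravel() / P.sum())
print(np.column_stack(np.unravel_index(flat, P.shape)))
```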
no code implementations • NeurIPS 2020 • Sinho Chewi, Thibaut Le Gouic, Chen Lu, Tyler Maunu, Philippe Rigollet
Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described as the kernelized gradient flow for the Kullback-Leibler divergence in the geometry of optimal transport.
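As a concrete reminder of the update that this gradient-flow view describes, here is a minimal SVGD step with an RBF kernel on a standard Gaussian target; the kernel bandwidth, step size, and target are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_log_p(x):
    # Score of a standard Gaussian target: grad log p(x) = -x.
    return -x

def svgd_step(x, step=0.1, bandwidth=1.0):
    # x: (n, d) particle locations.
    diff = x[:, None, :] - x[None, :, :]          # diff[j, i] = x_j - x_i
    sq = (diff ** 2).sum(-1)
    k = np.exp(-sq / (2 * bandwidth))             # k(x_j, x_i)
    grad_k = -diff * k[..., None] / bandwidth     # grad_{x_j} k(x_j, x_i)
    # phi(x_i) = mean_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (k[..., None] * grad_log_p(x)[:, None, :] + grad_k).mean(0)
    return x + step * phi

particles = rng.normal(loc=5.0, size=(100, 2))    # start far from the target
for _ in range(200):
    particles = svgd_step(particles)
print(particles.mean(0))  # drifts toward the target mean (0, 0)
```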
no code implementations • NeurIPS 2020 • Sinho Chewi, Thibaut Le Gouic, Chen Lu, Tyler Maunu, Philippe Rigollet, Austin J. Stromme
Motivated by the problem of sampling from ill-conditioned log-concave distributions, we give a clean non-asymptotic convergence analysis of mirror-Langevin diffusions as introduced in Zhang et al. (2020).
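For intuition, here is a toy Euler–Maruyama discretization of a mirror-Langevin diffusion in the Newton–Langevin special case (mirror map equal to the potential) for an ill-conditioned Gaussian target; the target, step size, and iteration count are illustrative assumptions, not the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ill-conditioned Gaussian target: V(x) = 0.5 x^T S^{-1} x with S = diag(1, 100).
# Mirror map phi = V (Newton-Langevin special case), so
#   grad phi(x) = S^{-1} x,  grad phi*(y) = S y,  hess phi = S^{-1}.
S = np.diag([1.0, 100.0])
S_inv = np.linalg.inv(S)
S_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(S)))

h, steps = 0.1, 2000
x = np.array([10.0, 10.0])
samples = []
for _ in range(steps):
    y = S_inv @ x                               # map to the dual (mirror) space
    y = y - h * (S_inv @ x) + np.sqrt(2 * h) * (S_inv_sqrt @ rng.normal(size=2))
    x = S @ y                                   # map back with grad phi*
    samples.append(x.copy())

print(np.cov(np.array(samples[500:]).T))  # roughly diag(1, 100) despite the conditioning
```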
no code implementations • 13 Feb 2020 • Tyler Maunu, Gilad Lerman
We give robust recovery results for synchronization on the rotation group, $\mathrm{SO}(D)$.
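For context, the sketch below implements the classical spectral baseline for rotation synchronization (stack the relative rotations $R_i R_j^T$ into a block matrix, take its top eigenvectors, and project each block back to $\mathrm{SO}(D)$); this is a standard non-robust baseline, not the robust method studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def project_to_SO(M):
    # Nearest rotation to M via SVD, with a determinant fix.
    U, _, Vt = np.linalg.svd(M)
    if np.linalg.det(U @ Vt) < 0:
        U[:, -1] *= -1
    return U @ Vt

D, n = 3, 20
# Ground-truth rotations and clean relative measurements R_ij = R_i R_j^T.
R_true = [project_to_SO(rng.normal(size=(D, D))) for _ in range(n)]
blocks = np.zeros((n * D, n * D))
for i in range(n):
    for j in range(n):
        blocks[i * D:(i + 1) * D, j * D:(j + 1) * D] = R_true[i] @ R_true[j].T

# Spectral synchronization: top-D eigenvectors, then project each block to SO(D).
vals, vecs = np.linalg.eigh(blocks)
top = vecs[:, -D:] * np.sqrt(n)
if np.linalg.det(top[:D]) < 0:
    top[:, -1] *= -1                    # resolve the global reflection ambiguity
R_est = [project_to_SO(top[i * D:(i + 1) * D]) for i in range(n)]

# Estimates match the ground truth up to one global rotation G.
G = R_est[0].T @ R_true[0]
print(max(np.linalg.norm(R_est[i] @ G - R_true[i]) for i in range(n)))
```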
no code implementations • 5 Apr 2019 • Tyler Maunu, Gilad Lerman
The two estimators are fast to compute and achieve state-of-the-art theoretical performance in a noiseless robust subspace recovery (RSR) setting with adversarial outliers.
no code implementations • 2 Mar 2018 • Gilad Lerman, Tyler Maunu
This paper will serve as an introduction to the body of work on robust subspace recovery.
no code implementations • 13 Jun 2017 • Tyler Maunu, Teng Zhang, Gilad Lerman
The practicality of the deterministic condition is demonstrated on some statistical models of data, and the method achieves nearly state-of-the-art recovery guarantees on the Haystack Model across different regimes of sample size and ambient dimension.
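For readers unfamiliar with it, the Haystack Model places Gaussian inliers on a low-dimensional subspace among ambient Gaussian outliers; the sketch below samples data from one common parameterization (exact normalizations vary across papers).

```python
import numpy as np

rng = np.random.default_rng(0)

def haystack(n_in, n_out, d, D, sigma_in=1.0, sigma_out=1.0):
    # Random d-dimensional subspace of R^D via an orthonormal basis.
    basis, _ = np.linalg.qr(rng.normal(size=(D, d)))
    # Inliers: Gaussian on the subspace; outliers: Gaussian in the ambient space.
    inliers = rng.normal(size=(n_in, d)) @ basis.T * (sigma_in / np.sqrt(d))
    outliers = rng.normal(size=(n_out, D)) * (sigma_out / np.sqrt(D))
    return np.vstack([inliers, outliers]), basis

X, basis = haystack(n_in=200, n_out=200, d=5, D=50)
print(X.shape)  # (400, 50)
```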
no code implementations • 24 Jun 2014 • Gilad Lerman, Tyler Maunu
Further, under a special model of data, FMS converges to a point near the global minimum with overwhelming probability.
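As a rough illustration of the iteratively reweighted least squares idea behind FMS (the initialization, weight cap, and iteration count here are simplified assumptions, not the paper's exact algorithm):

```python
import numpy as np

def fms_like(X, d, n_iter=50, delta=1e-10):
    """IRLS-style subspace recovery: reweight points by 1/distance to the current
    subspace and take the top-d eigenvectors of the weighted covariance."""
    # Initialize with PCA (top-d right singular vectors of X).
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    V = Vt[:d].T                                  # (D, d) orthonormal basis
    for _ in range(n_iter):
        resid = X - (X @ V) @ V.T                 # residuals to the current subspace
        dist = np.linalg.norm(resid, axis=1)
        w = 1.0 / np.maximum(dist, delta)         # downweight far-away (outlying) points
        cov = (X * w[:, None]).T @ X              # weighted covariance (up to scaling)
        _, vecs = np.linalg.eigh(cov)
        V = vecs[:, -d:]                          # top-d eigenvectors
    return V

# Quick check on synthetic data: a 5-dim subspace in R^50 plus ambient outliers.
rng = np.random.default_rng(0)
basis, _ = np.linalg.qr(rng.normal(size=(50, 5)))
X = np.vstack([rng.normal(size=(200, 5)) @ basis.T,
               rng.normal(size=(200, 50))])
V = fms_like(X, d=5)
print(np.linalg.norm(V @ V.T - basis @ basis.T))  # small when recovery succeeds
```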