no code implementations • 15 Feb 2022 • Vianney Perchet, Philippe Rigollet, Thibaut Le Gouic
In the case of asymmetric values, where optimal solutions need not exist but Nash equilibria do, our algorithm samples from an $\varepsilon$-Nash equilibrium with similar complexity, although the implicit constants depend on various parameters of the game, such as the battlefield values.
no code implementations • 27 Dec 2021 • Kathleen Pele, Jean Baccou, Loïc Daridon, Jacques Liandrat, Thibaut Le Gouic, Yann Monerie, Frédéric Péralès
This paper is devoted to the construction of a new fast-to-evaluate model for the prediction of 2D crack paths in concrete-like microstructures.
no code implementations • 29 May 2021 • Sinho Chewi, Patrik Gerber, Chen Lu, Thibaut Le Gouic, Philippe Rigollet
We consider the task of generating exact samples from a target distribution, known up to normalization, over a finite alphabet.
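To illustrate the problem setup (not the paper's algorithm), plain rejection sampling already produces exact samples from an unnormalized distribution over a finite alphabet, using only an upper bound on the weights rather than their sum. A minimal sketch:

```python
import random

def exact_sample(weights, rng):
    """Draw one exact sample i with probability weights[i] / sum(weights).

    Illustrative rejection sampling: it needs only an upper bound on the
    unnormalized weights, never the normalizing constant.
    """
    m = max(weights)                        # upper bound on any weight
    while True:
        i = rng.randrange(len(weights))     # propose a letter uniformly
        if rng.random() * m <= weights[i]:  # accept with prob weights[i]/m
            return i

rng = random.Random(0)
counts = [0, 0, 0]
for _ in range(30000):
    counts[exact_sample([1.0, 2.0, 3.0], rng)] += 1
# counts is roughly proportional to [1, 2, 3]
```

The accepted draws are exactly distributed as the normalized target; the cost of this naive scheme is the expected number of proposals, which is what more refined algorithms improve on.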
no code implementations • 29 May 2021 • Sinho Chewi, Patrik Gerber, Chen Lu, Thibaut Le Gouic, Philippe Rigollet
We establish the first tight lower bound of $\Omega(\log\log\kappa)$ on the query complexity of sampling from the class of strongly log-concave and log-smooth distributions with condition number $\kappa$ in one dimension.
no code implementations • 4 May 2021 • Chiheb Daaloul, Thibaut Le Gouic, Jacques Liandrat, Magali Tournus
Our method is based on the gradient flow of the multimarginal formulation of the Wasserstein barycenter, with an additive penalization to account for the marginal constraints.
no code implementations • 23 Dec 2020 • Sinho Chewi, Chen Lu, Kwangjun Ahn, Xiang Cheng, Thibaut Le Gouic, Philippe Rigollet
Conventional wisdom in the sampling literature, backed by a popular diffusion scaling limit, suggests that the mixing time of the Metropolis-Adjusted Langevin Algorithm (MALA) scales as $O(d^{1/3})$, where $d$ is the dimension.
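MALA combines a discretized Langevin proposal with a Metropolis accept/reject correction. A minimal one-dimensional sketch, with a standard-Gaussian target and an illustrative step size (both chosen here for the example, not taken from the paper):

```python
import math, random

def mala_step(x, grad_logpi, log_pi, h, rng):
    """One Metropolis-Adjusted Langevin step with step size h."""
    # Langevin proposal: one Euler step of the gradient flow plus noise.
    y = x + h * grad_logpi(x) + math.sqrt(2 * h) * rng.gauss(0.0, 1.0)

    def log_q(a, b):
        # Log density (up to a constant) of proposing b from a.
        return -((b - a - h * grad_logpi(a)) ** 2) / (4 * h)

    # Metropolis correction removes the discretization bias exactly.
    log_alpha = log_pi(y) + log_q(y, x) - log_pi(x) - log_q(x, y)
    return y if rng.random() < math.exp(min(0.0, log_alpha)) else x

# Target: standard Gaussian, log pi(x) = -x^2 / 2 (up to a constant).
rng = random.Random(0)
x, samples = 3.0, []
for _ in range(20000):
    x = mala_step(x, lambda t: -t, lambda t: -t * t / 2, h=0.5, rng=rng)
    samples.append(x)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

The accept/reject step is what distinguishes MALA from the unadjusted Langevin algorithm, and its sensitivity to the step size $h$ as $d$ grows is exactly what the dimension-dependence analysis quantifies.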
1 code implementation • NeurIPS 2020 • Sinho Chewi, Thibaut Le Gouic, Chen Lu, Tyler Maunu, Philippe Rigollet
Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described as the kernelized gradient flow for the Kullback-Leibler divergence in the geometry of optimal transport.
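The SVGD update itself is short: each particle moves along a kernel-weighted average of the score plus a repulsive term. A minimal one-dimensional sketch with an RBF kernel and a standard-Gaussian target (kernel bandwidth, step size, and particle grid are illustrative choices, not from the paper):

```python
import math

def svgd_step(xs, grad_logpi, h=1.0, eps=0.1):
    """One Stein Variational Gradient Descent update with an RBF kernel:
    the kernelized gradient flow of KL(. || pi) acting on the particles."""
    n = len(xs)
    updated = []
    for i in range(n):
        phi = 0.0
        for j in range(n):
            k = math.exp(-((xs[j] - xs[i]) ** 2) / (2 * h))
            grad_k = -(xs[j] - xs[i]) / h * k   # d k(x_j, x_i) / d x_j
            # Driving term (toward pi) plus repulsive term (spread).
            phi += k * grad_logpi(xs[j]) + grad_k
        updated.append(xs[i] + eps * phi / n)
    return updated

# Push a grid of particles toward a standard Gaussian: grad log pi = -x.
xs = [v / 2.0 for v in range(-10, 11)]   # 21 particles on [-5, 5]
for _ in range(500):
    xs = svgd_step(xs, lambda t: -t)
svgd_mean = sum(xs) / len(xs)
svgd_var = sum((v - svgd_mean) ** 2 for v in xs) / len(xs)
```

The deterministic update is what makes the "kernelized gradient flow" description tempting; whether it is exactly a gradient flow in the Wasserstein geometry is the question the paper examines.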
no code implementations • 24 May 2020 • Thibaut Le Gouic, Jean-Michel Loubes, Philippe Rigollet
In the context of regression, we consider the fundamental question of making an estimator fair while preserving its prediction accuracy as much as possible.
no code implementations • NeurIPS 2020 • Sinho Chewi, Thibaut Le Gouic, Chen Lu, Tyler Maunu, Philippe Rigollet, Austin J. Stromme
Motivated by the problem of sampling from ill-conditioned log-concave distributions, we give a clean non-asymptotic convergence analysis of mirror-Langevin diffusions as introduced in Zhang et al. (2020).
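A mirror-Langevin diffusion runs Langevin dynamics in the dual coordinates $Y = \nabla\phi(X)$ of a mirror map $\phi$: $dY_t = -\nabla f(X_t)\,dt + \sqrt{2\,\nabla^2\phi(X_t)}\,dB_t$ for a target $\pi \propto e^{-f}$. A one-dimensional Euler-discretized sketch, checked here on the Euclidean mirror map $\phi(x) = x^2/2$, where it reduces to the unadjusted Langevin algorithm (the step size and target are illustrative):

```python
import math, random

def mirror_langevin_step(x, grad_f, grad_phi, grad_phi_inv, hess_phi, h, rng):
    """One Euler step of a mirror-Langevin diffusion (1D sketch):
    Y = grad_phi(X),  dY = -grad_f(X) dt + sqrt(2 hess_phi(X)) dB."""
    noise = math.sqrt(2 * h * hess_phi(x)) * rng.gauss(0.0, 1.0)
    y = grad_phi(x) - h * grad_f(x) + noise   # step in the dual space
    return grad_phi_inv(y)                    # map back to the primal space

# Euclidean mirror map phi(x) = x^2/2; target pi = N(0,1), f(x) = x^2/2.
rng = random.Random(1)
x, total, total2, n = 2.0, 0.0, 0.0, 20000
for _ in range(n):
    x = mirror_langevin_step(x, lambda t: t, lambda t: t, lambda t: t,
                             lambda t: 1.0, h=0.05, rng=rng)
    total += x
    total2 += x * x
ml_mean, ml_var = total / n, total2 / n - (total / n) ** 2
```

A non-Euclidean $\phi$ (e.g. an entropic mirror map on a constrained domain) adapts the noise to the local geometry, which is the mechanism behind the improved behavior on ill-conditioned targets.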
no code implementations • 29 Jan 2018 • Thibaut Le Gouic, Quentin Paris
In this paper, we define and study a new notion of stability for the $k$-means clustering scheme building upon the notion of quantization of a probability measure.
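The quantization of a probability measure behind this notion of stability can be illustrated on an empirical measure: the $k$-means objective is precisely the squared quantization error of the data by a $k$-point codebook. A toy sketch (function names are hypothetical):

```python
def quantization_error(points, codebook):
    """Mean squared distance from each point to its nearest code point:
    the k-means objective, i.e. the squared quantization error of the
    empirical measure of `points` by the point set `codebook`."""
    return sum(min((p - c) ** 2 for c in codebook) for p in points) / len(points)

# Two well-separated clusters: one code point per cluster quantizes
# far better than any single code point can.
data = [0.0, 1.0, 10.0, 11.0]
err_two = quantization_error(data, [0.5, 10.5])   # -> 0.25
err_one = quantization_error(data, [5.5])         # -> 25.25
```

Stability then asks how much the optimal codebook, not just the optimal error, moves when the underlying measure is perturbed.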
no code implementations • 12 Jun 2015 • Thibaut Le Gouic
For a given class $\mathcal{F}$ of closed sets of a measured metric space $(E, d, \mu)$, we want to find the smallest element $B$ of the class $\mathcal{F}$ such that $\mu(B) \geq 1-\alpha$, for a given $0 < \alpha < 1$.
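In the simplest instance of this problem, $E = \mathbb{R}$, $\mu$ is an empirical measure, and $\mathcal{F}$ is the class of closed intervals; the shortest interval of mass at least $1-\alpha$ can then be found by scanning windows of order statistics. A toy sketch (not the paper's estimator):

```python
import math

def smallest_interval(points, alpha):
    """Shortest closed interval B with empirical mass mu(B) >= 1 - alpha."""
    pts = sorted(points)
    n = len(pts)
    k = math.ceil((1 - alpha) * n)   # number of sample points B must cover
    # Any candidate interval is [pts[i], pts[i + k - 1]]; take the shortest.
    i = min(range(n - k + 1), key=lambda j: pts[j + k - 1] - pts[j])
    return pts[i], pts[i + k - 1]

lo, hi = smallest_interval([0.0, 0.1, 0.2, 0.3, 5.0], alpha=0.2)
# -> (0.0, 0.3): dropping the outlier 5.0 keeps mass 4/5 = 1 - alpha
```

For richer classes $\mathcal{F}$ (balls, convex sets, unions of sets), "smallest" and the search over candidates both become nontrivial, which is what the general theory addresses.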
Statistics Theory
no code implementations • 11 Jun 2015 • Thibaut Le Gouic
Given a geodesic space $(E, d)$, we show that full ordinal knowledge of the metric $d$, i.e. knowledge of the function $D_d : (w, x, y, z) \mapsto \mathbb{1}_{d(w, x) \le d(y, z)}$, determines the metric $d$ uniquely up to a constant factor. For a subspace $E_n$ of $n$ points of $E$, converging in Hausdorff distance to $E$, we construct a metric $d_n$ on $E_n$ based only on the knowledge of $D_d$ on $E_n$, and establish a sharp upper bound on the Gromov-Hausdorff distance between $(E_n, d_n)$ and $(E, d)$.