Search Results for author: Thibaut Le Gouic

Found 12 papers, 1 paper with code

An algorithmic solution to the Blotto game using multi-marginal couplings

no code implementations • 15 Feb 2022 • Vianney Perchet, Philippe Rigollet, Thibaut Le Gouic

In the case of asymmetric values where optimal solutions need not exist but Nash equilibria do, our algorithm samples from an $\varepsilon$-Nash equilibrium with similar complexity but where implicit constants depend on various parameters of the game such as battlefield values.

A probabilistic model for fast-to-evaluate 2D crack path prediction in heterogeneous materials

no code implementations • 27 Dec 2021 • Kathleen Pele, Jean Baccou, Loïc Daridon, Jacques Liandrat, Thibaut Le Gouic, Yann Monerie, Frédéric Péralès

This paper is devoted to the construction of a new fast-to-evaluate model for the prediction of 2D crack paths in concrete-like microstructures.

Rejection sampling from shape-constrained distributions in sublinear time

no code implementations • 29 May 2021 • Sinho Chewi, Patrik Gerber, Chen Lu, Thibaut Le Gouic, Philippe Rigollet

We consider the task of generating exact samples from a target distribution, known up to normalization, over a finite alphabet.
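For orientation, the sketch below is a minimal textbook rejection sampler over a finite alphabet, the setting of this paper; it is not the sublinear-time algorithm studied here, and the names (`unnormalized`, `envelope`, `envelope_sampler`) are illustrative assumptions.

```python
import random

def rejection_sample(unnormalized, envelope, envelope_sampler, rng=random):
    """Draw one exact sample from a distribution known up to normalization
    over a finite alphabet, using a dominating proposal.

    unnormalized: dict of unnormalized target weights per symbol
    envelope: dict with envelope[x] >= unnormalized[x] for every symbol x
    envelope_sampler: callable returning a symbol drawn proportionally to envelope
    """
    while True:
        x = envelope_sampler()
        # Accept x with probability unnormalized[x] / envelope[x] <= 1.
        if rng.random() * envelope[x] <= unnormalized[x]:
            return x

# Illustrative usage: a three-symbol target with a flat dominating envelope.
weights = {"a": 0.2, "b": 1.5, "c": 0.8}
flat = {x: 1.5 for x in weights}                 # dominates every target weight
draw_flat = lambda: random.choice(list(weights))  # uniform = proportional to flat
print(rejection_sample(weights, flat, draw_flat))
```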

The query complexity of sampling from strongly log-concave distributions in one dimension

no code implementations • 29 May 2021 • Sinho Chewi, Patrik Gerber, Chen Lu, Thibaut Le Gouic, Philippe Rigollet

We establish the first tight lower bound of $\Omega(\log\log\kappa)$ on the query complexity of sampling from the class of strongly log-concave and log-smooth distributions with condition number $\kappa$ in one dimension.

Sampling From the Wasserstein Barycenter

no code implementations • 4 May 2021 • Chiheb Daaloul, Thibaut Le Gouic, Jacques Liandrat, Magali Tournus

Our method is based on the gradient flow of the multimarginal formulation of the Wasserstein barycenter, with an additive penalization to account for the marginal constraints.

Optimal dimension dependence of the Metropolis-Adjusted Langevin Algorithm

no code implementations • 23 Dec 2020 • Sinho Chewi, Chen Lu, Kwangjun Ahn, Xiang Cheng, Thibaut Le Gouic, Philippe Rigollet

Conventional wisdom in the sampling literature, backed by a popular diffusion scaling limit, suggests that the mixing time of the Metropolis-Adjusted Langevin Algorithm (MALA) scales as $O(d^{1/3})$, where $d$ is the dimension.
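As context for the abstract above, here is a sketch of one standard MALA transition (a Langevin proposal followed by a Metropolis-Hastings correction); it illustrates the algorithm being analyzed, not the paper's proof technique, and the Gaussian target in the usage example is purely illustrative.

```python
import numpy as np

def mala_step(x, grad_log_pi, log_pi, h, rng):
    """One Metropolis-Adjusted Langevin step with step size h.
    grad_log_pi, log_pi: gradient and log-density of the target (up to a constant)."""
    # Langevin proposal: gradient step plus Gaussian noise.
    y = x + h * grad_log_pi(x) + np.sqrt(2.0 * h) * rng.standard_normal(x.shape)

    # Log density of the Gaussian proposal q(b | a) = N(a + h * grad_log_pi(a), 2h I).
    def log_q(b, a):
        diff = b - a - h * grad_log_pi(a)
        return -np.sum(diff ** 2) / (4.0 * h)

    # Metropolis-Hastings acceptance step.
    log_alpha = log_pi(y) + log_q(x, y) - log_pi(x) - log_q(y, x)
    return y if np.log(rng.uniform()) < log_alpha else x

# Illustrative usage: standard Gaussian target in dimension d.
d, rng = 10, np.random.default_rng(0)
x = rng.standard_normal(d)
for _ in range(1000):
    x = mala_step(x, grad_log_pi=lambda z: -z, log_pi=lambda z: -0.5 * z @ z, h=0.1, rng=rng)
```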

SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence

1 code implementation • NeurIPS 2020 • Sinho Chewi, Thibaut Le Gouic, Chen Lu, Tyler Maunu, Philippe Rigollet

Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described as the kernelized gradient flow for the Kullback-Leibler divergence in the geometry of optimal transport.
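For reference, a minimal SVGD update with an RBF kernel is sketched below; this is the standard algorithm that the paper reinterprets as a kernelized gradient flow, and the bandwidth, step size, and Gaussian example are illustrative choices, not the paper's setup.

```python
import numpy as np

def svgd_update(particles, grad_log_p, step, bandwidth):
    """One Stein Variational Gradient Descent step with an RBF kernel.
    particles: (n, d) array; grad_log_p: maps (n, d) -> (n, d) score values."""
    n = particles.shape[0]
    diffs = particles[:, None, :] - particles[None, :, :]      # x_i - x_j
    sq_dists = np.sum(diffs ** 2, axis=-1)
    kernel = np.exp(-sq_dists / (2.0 * bandwidth ** 2))        # k(x_i, x_j)

    # Update direction: (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    drive = kernel @ grad_log_p(particles)                                   # pull toward high density
    repulse = np.sum(kernel[:, :, None] * diffs, axis=1) / bandwidth ** 2    # keep particles spread out
    return particles + step * (drive + repulse) / n

# Illustrative usage: particles driven toward a standard 2D Gaussian.
rng = np.random.default_rng(0)
x = 3.0 * rng.standard_normal((100, 2))
for _ in range(500):
    x = svgd_update(x, grad_log_p=lambda z: -z, step=0.1, bandwidth=1.0)
```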

Projection to Fairness in Statistical Learning

no code implementations • 24 May 2020 • Thibaut Le Gouic, Jean-Michel Loubes, Philippe Rigollet

In the context of regression, we consider the fundamental question of making an estimator fair while preserving its prediction accuracy as much as possible.

Fairness regression

Exponential ergodicity of mirror-Langevin diffusions

no code implementations • NeurIPS 2020 • Sinho Chewi, Thibaut Le Gouic, Chen Lu, Tyler Maunu, Philippe Rigollet, Austin J. Stromme

Motivated by the problem of sampling from ill-conditioned log-concave distributions, we give a clean non-asymptotic convergence analysis of mirror-Langevin diffusions as introduced in Zhang et al. (2020).
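As a rough illustration, the sketch below takes one Euler step of a mirror-Langevin dynamic in this family, assuming the user supplies the potential gradient, the mirror map's gradient, the gradient of its convex conjugate, and a square root of its Hessian; the quadratic mirror map in the usage example simply recovers plain Langevin and is not taken from the paper.

```python
import numpy as np

def mirror_langevin_step(x, grad_f, grad_phi, grad_phi_star, hess_phi_sqrt, h, rng):
    """One Euler step of a mirror-Langevin dynamic:
    move in the dual (mirror) coordinates y = grad phi(x), then map back via grad phi*."""
    y = grad_phi(x)
    noise = hess_phi_sqrt(x) @ rng.standard_normal(x.shape)
    y_next = y - h * grad_f(x) + np.sqrt(2.0 * h) * noise
    return grad_phi_star(y_next)

# Illustrative usage: phi(x) = ||x||^2 / 2 (identity mirror map) and a standard
# Gaussian target f(x) = ||x||^2 / 2, which reduces to the plain Langevin scheme.
d, rng = 5, np.random.default_rng(0)
x = rng.standard_normal(d)
identity = np.eye(d)
for _ in range(1000):
    x = mirror_langevin_step(
        x,
        grad_f=lambda z: z,
        grad_phi=lambda z: z,
        grad_phi_star=lambda y: y,
        hess_phi_sqrt=lambda z: identity,
        h=0.01,
        rng=rng,
    )
```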

A notion of stability for k-means clustering

no code implementations • 29 Jan 2018 • Thibaut Le Gouic, Quentin Paris

In this paper, we define and study a new notion of stability for the $k$-means clustering scheme building upon the notion of quantization of a probability measure.

Clustering Quantization

Mass localization

no code implementations • 12 Jun 2015 • Thibaut Le Gouic

For a given class $\mathcal{F}$ of closed sets of a measured metric space $(E, d,\mu)$, we want to find the smallest element $B$ of the class $\mathcal{F}$ such that $\mu(B)\geq 1-\alpha$, for a given $0<\alpha<1$.

Statistics Theory
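To make the problem concrete, the toy sketch below solves an empirical version of it on the real line, with the class $\mathcal{F}$ taken to be closed intervals: find the shortest interval covering at least a $1-\alpha$ fraction of a sample. This is only an illustration of the definition, not the estimator studied in the paper.

```python
import numpy as np

def smallest_interval(sample, alpha):
    """Shortest interval containing at least a (1 - alpha) fraction of the sample
    (empirical stand-in for the mass-localization problem with F = closed intervals)."""
    x = np.sort(np.asarray(sample))
    n = len(x)
    k = int(np.ceil((1.0 - alpha) * n))      # number of points the interval must cover
    widths = x[k - 1:] - x[: n - k + 1]      # width of every window of k consecutive points
    i = int(np.argmin(widths))
    return x[i], x[i + k - 1]

# Illustrative usage: shortest interval holding 90% of a Gaussian sample.
rng = np.random.default_rng(0)
print(smallest_interval(rng.standard_normal(10_000), alpha=0.1))
```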

Recovering metric from full ordinal information

no code implementations • 11 Jun 2015 • Thibaut Le Gouic

Given a geodesic space $(E, d)$, we show that full ordinal knowledge of the metric $d$, i.e. knowledge of the function $D_d : (w, x, y, z) \mapsto \mathbf{1}_{d(w, x) \le d(y, z)}$, determines the metric $d$ uniquely up to a constant factor. For a subspace $E_n$ of $n$ points of $E$, converging in Hausdorff distance to $E$, we construct a metric $d_n$ on $E_n$ based only on the knowledge of $D_d$ on $E_n$, and establish a sharp upper bound on the Gromov-Hausdorff distance between $(E_n, d_n)$ and $(E, d)$.
