Search Results for author: Jonathan Niles-Weed

Found 20 papers, 2 papers with code

Supervised Quantile Normalization for Low Rank Matrix Factorization

no code implementations ICML 2020 Marco Cuturi, Olivier Teboul, Jonathan Niles-Weed, Jean-Philippe Vert

Low rank matrix factorization is a fundamental building block in machine learning, used for instance to summarize gene expression profile data or word-document counts.
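
The excerpt above motivates low rank matrix factorization in general terms. As a minimal sketch (illustration only, not the paper's supervised quantile normalization method; the Poisson count data and the target rank are assumptions), a rank-k approximation can be computed with a truncated SVD:

```python
# Illustration: best rank-k approximation (in Frobenius norm) of a count-like matrix.
import numpy as np

rng = np.random.default_rng(0)
X = rng.poisson(lam=3.0, size=(100, 50)).astype(float)  # e.g., word-document counts

k = 5                                                   # target rank (assumed)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]             # truncated SVD reconstruction

print("relative approximation error:", np.linalg.norm(X - X_k) / np.linalg.norm(X))
```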

Learning Costs for Structured Monge Displacements

no code implementations20 Jun 2023 Michal Klein, Aram-Alexandre Pooladian, Pierre Ablin, Eugène Ndiaye, Jonathan Niles-Weed, Marco Cuturi

Because of such difficulties, existing approaches rarely depart from the default choice of estimating such maps with the simple squared-Euclidean distance as the ground cost, $c(x, y)=\|x-y\|^2_2$.
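
As context for the default ground cost mentioned above, a small sketch that builds the squared-Euclidean cost matrix and computes the optimal matching between two equal-size samples as an assignment problem (the sample sizes and Gaussian data are assumptions; the paper itself is about learning structured alternatives to this cost):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 2))         # source sample
Y = rng.normal(size=(64, 2)) + 1.0   # target sample

# Default ground cost: C[i, j] = ||x_i - y_j||_2^2
C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)

row, col = linear_sum_assignment(C)  # optimal one-to-one (Monge) matching
print("average cost under the squared-Euclidean ground cost:", C[row, col].mean())
```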

Minimax estimation of discontinuous optimal transport maps: The semi-discrete case

no code implementations26 Jan 2023 Aram-Alexandre Pooladian, Vincent Divol, Jonathan Niles-Weed

We consider the problem of estimating the optimal transport map between two probability distributions, $P$ and $Q$ in $\mathbb{R}^d$, on the basis of i.i.d. samples.

Optimal transport map estimation in general function spaces

no code implementations7 Dec 2022 Vincent Divol, Jonathan Niles-Weed, Aram-Alexandre Pooladian

To ensure identifiability, we assume that $T = \nabla \varphi_0$ is the gradient of a convex function, in which case $T$ is known as an \emph{optimal transport map}.
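
For background on the identifiability assumption quoted above, the quadratic-cost Monge problem and Brenier's characterization of its solution (standard statements, not specific to this paper's function-space analysis) read:

```latex
% Monge problem with quadratic cost; by Brenier, the optimal map is the gradient of a convex potential.
\inf_{T \,:\, T_{\#}P = Q} \int_{\mathbb{R}^d} \|x - T(x)\|^2 \, dP(x),
\qquad
T_0 = \nabla \varphi_0, \quad \varphi_0 \ \text{convex}.
```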

Perturbation Analysis of Neural Collapse

no code implementations29 Oct 2022 Tom Tirer, Haoxiang Huang, Jonathan Niles-Weed

In this paper, we propose a richer model that can capture this phenomenon by forcing the features to stay in the vicinity of a predefined features matrix (e.g., intermediate features).
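
A rough sketch of the kind of objective the excerpt describes: a linear classifier on free ("unconstrained") features H, with a Frobenius penalty keeping H near a predefined features matrix H0. The loss, penalty form, and dimensions are assumptions for illustration, not the paper's exact model.

```python
import numpy as np

def objective(W, b, H, H0, labels, lam=0.1):
    """Cross-entropy of a linear classifier on free features H, plus a penalty
    keeping H in the vicinity of the predefined features matrix H0."""
    logits = W @ H + b[:, None]                   # (num_classes, num_samples)
    logits -= logits.max(axis=0, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=0, keepdims=True))
    ce = -log_probs[labels, np.arange(H.shape[1])].mean()
    return ce + lam * np.linalg.norm(H - H0) ** 2

rng = np.random.default_rng(0)
num_classes, dim, n = 4, 16, 32
W, b = rng.normal(size=(num_classes, dim)), np.zeros(num_classes)
H0 = rng.normal(size=(dim, n))                    # predefined features (e.g., intermediate features)
H = H0 + 0.1 * rng.normal(size=(dim, n))          # free features staying near H0
labels = rng.integers(num_classes, size=n)
print(objective(W, b, H, H0, labels))
```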

Existence and Minimax Theorems for Adversarial Surrogate Risks in Binary Classification

no code implementations18 Jun 2022 Natalie S. Frank, Jonathan Niles-Weed

Adversarial training is one of the most popular methods for training models robust to adversarial attacks; however, it is not well understood from a theoretical perspective.

Adversarial Robustness Binary Classification
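
A common way to write the adversarial surrogate risk studied in this line of work (the perturbation model, norm, and surrogate loss $\phi$ are assumptions; the excerpt does not state the exact form):

```latex
R_\phi^\varepsilon(f) \;=\; \mathbb{E}_{(X,Y)}\Big[\,\sup_{\|x' - X\| \le \varepsilon} \phi\big(Y f(x')\big)\Big].
```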

An improved central limit theorem and fast convergence rates for entropic transportation costs

no code implementations19 Apr 2022 Eustasio del Barrio, Alberto Gonzalez-Sanz, Jean-Michel Loubes, Jonathan Niles-Weed

We prove a central limit theorem for the entropic transportation cost between subgaussian probability measures, centered at the population cost.

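For reference, a standard formulation of the entropic transportation cost between $P$ and $Q$ (the quadratic cost and the KL normalization are assumed conventions; the excerpt does not fix them):

```latex
S_\varepsilon(P, Q) \;=\; \inf_{\pi \in \Pi(P, Q)} \int \|x - y\|^2 \, d\pi(x, y)
\;+\; \varepsilon \, \mathrm{KL}\!\big(\pi \,\|\, P \otimes Q\big).
```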

Deep Probability Estimation

no code implementations21 Nov 2021 Sheng Liu, Aakash Kaku, Weicheng Zhu, Matan Leibovich, Sreyas Mohan, Boyang Yu, Haoxiang Huang, Laure Zanna, Narges Razavian, Jonathan Niles-Weed, Carlos Fernandez-Granda

Reliable probability estimation is of crucial importance in many real-world applications where there is inherent (aleatoric) uncertainty.

Autonomous Vehicles Binary Classification +2

Entropic estimation of optimal transport maps

no code implementations24 Sep 2021 Aram-Alexandre Pooladian, Jonathan Niles-Weed

We develop a computationally tractable method for estimating the optimal map between two distributions over $\mathbb{R}^d$ with rigorous finite-sample guarantees.
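
A minimal numpy sketch in the spirit of entropic map estimation: run (log-domain) Sinkhorn between two samples and map each source point by the barycentric projection of the entropic plan. The regularization value, iteration count, and data are assumptions, and this is not the paper's exact estimator or its finite-sample analysis.

```python
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))         # sample from P
Y = rng.normal(size=(200, 2)) + 1.0   # sample from Q
eps = 0.5                             # entropic regularization (assumed)

C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
log_a = np.full(len(X), -np.log(len(X)))
log_b = np.full(len(Y), -np.log(len(Y)))

f, g = np.zeros(len(X)), np.zeros(len(Y))      # dual potentials (log-domain Sinkhorn)
for _ in range(500):
    f = -eps * logsumexp((g[None, :] - C) / eps + log_b[None, :], axis=1)
    g = -eps * logsumexp((f[:, None] - C) / eps + log_a[:, None], axis=0)

P = np.exp((f[:, None] + g[None, :] - C) / eps + log_a[:, None] + log_b[None, :])
T_hat = (P @ Y) / P.sum(axis=1, keepdims=True)  # barycentric projection: estimated map at each x_i
```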

Plugin Estimation of Smooth Optimal Transport Maps

1 code implementation26 Jul 2021 Tudor Manole, Sivaraman Balakrishnan, Jonathan Niles-Weed, Larry Wasserman

Our work also provides new bounds on the risk of corresponding plugin estimators for the quadratic Wasserstein distance, and we show how this problem relates to that of estimating optimal transport maps using stability arguments for smooth and strongly convex Brenier potentials.

It was "all" for "nothing": sharp phase transitions for noiseless discrete channels

no code implementations24 Feb 2021 Jonathan Niles-Weed, Ilias Zadik

We establish a phase transition known as the "all-or-nothing" phenomenon for noiseless discrete channels.

Statistics Theory Information Theory Probability

Streaming k-PCA: Efficient guarantees for Oja's algorithm, beyond rank-one updates

no code implementations6 Feb 2021 De Huang, Jonathan Niles-Weed, Rachel Ward

We analyze Oja's algorithm for streaming $k$-PCA and prove that it achieves performance nearly matching that of an optimal offline algorithm.
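
A minimal sketch of Oja's algorithm for streaming k-PCA with rank-k updates (the fixed step size, initialization, and simulated data stream are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, eta = 20, 3, 0.05
W, _ = np.linalg.qr(rng.normal(size=(d, k)))    # random orthonormal start

for _ in range(2000):                           # stream of samples, one at a time
    x = rng.multivariate_normal(np.zeros(d), np.diag(np.linspace(2.0, 0.1, d)))
    W = W + eta * np.outer(x, x @ W)            # Oja update with the new sample
    W, _ = np.linalg.qr(W)                      # re-orthonormalize the k columns

# Columns of W now approximate the top-k eigenvectors of the data covariance.
```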

The Discrepancy of Random Rectangular Matrices

no code implementations11 Jan 2021 Dylan J. Altschuler, Jonathan Niles-Weed

A recent approach to the Beck-Fiala conjecture, a fundamental problem in combinatorics, has been to understand when random integer matrices have constant discrepancy.

Probability Discrete Mathematics Combinatorics
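
For reference, the discrepancy of a matrix, the quantity at stake in the Beck-Fiala setting (standard definition; the column-sparsity framing is how the conjecture is usually stated, not a claim about this paper's exact assumptions):

```latex
\mathrm{disc}(A) \;=\; \min_{x \in \{-1, +1\}^n} \|Ax\|_\infty,
\qquad A \in \mathbb{R}^{m \times n}.
% The Beck-Fiala setting concerns 0/1 matrices whose columns each have at most t nonzero entries.
```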

Early-Learning Regularization Prevents Memorization of Noisy Labels

2 code implementations NeurIPS 2020 Sheng Liu, Jonathan Niles-Weed, Narges Razavian, Carlos Fernandez-Granda

In contrast with existing approaches, which use the model output during early learning to detect the examples with clean labels, and either ignore or attempt to correct the false labels, we take a different route and instead capitalize on early learning via regularization.

General Classification Learning with noisy labels +1
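
One possible way to instantiate "capitalize on early learning via regularization" (an assumption; the excerpt does not give the exact form of the paper's regularizer): keep a running average of the model's own past predictions for each example and penalize disagreement with it.

```python
import numpy as np

def regularized_loss(p, t, y, lam=1.0):
    """p: current predicted probabilities (n, K); t: running average of past predictions;
    y: integer labels (possibly noisy)."""
    ce = -np.log(p[np.arange(len(y)), y] + 1e-12).mean()   # cross-entropy on the given labels
    reg = -np.log((p * t).sum(axis=1) + 1e-12).mean()      # stay consistent with early predictions
    return ce + lam * reg

def update_targets(t, p, beta=0.7):
    """Temporal ensembling of predictions, used as the regularization target."""
    return beta * t + (1.0 - beta) * p
```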

Sinkhorn EM: An Expectation-Maximization algorithm based on entropic optimal transport

no code implementations30 Jun 2020 Gonzalo Mena, Amin Nejatbakhsh, Erdem Varol, Jonathan Niles-Weed

We study Sinkhorn EM (sEM), a variant of the expectation maximization (EM) algorithm for mixtures based on entropic optimal transport.
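
A rough sketch of the idea as described above: an EM-style update whose E-step computes responsibilities by Sinkhorn scaling, so that the total mass assigned to each component matches known mixture weights. The isotropic-Gaussian model, fixed weights, and regularization value are assumptions, not necessarily the authors' exact formulation.

```python
import numpy as np
from scipy.special import logsumexp

def sinkhorn_e_step(X, means, weights, eps=1.0, iters=200):
    """Responsibilities Gamma (n x K) with row sums 1/n and column sums `weights`."""
    n, K = len(X), len(means)
    C = 0.5 * ((X[:, None, :] - means[None, :, :]) ** 2).sum(-1)  # per-component negative log-likelihoods (up to constants)
    log_a, log_b = np.full(n, -np.log(n)), np.log(weights)
    f, g = np.zeros(n), np.zeros(K)
    for _ in range(iters):
        f = -eps * logsumexp((g[None, :] - C) / eps + log_b[None, :], axis=1)
        g = -eps * logsumexp((f[:, None] - C) / eps + log_a[:, None], axis=0)
    return np.exp((f[:, None] + g[None, :] - C) / eps + log_a[:, None] + log_b[None, :])

def m_step(X, Gamma):
    """Weighted component means, as in standard EM."""
    return (Gamma.T @ X) / Gamma.sum(axis=0)[:, None]
```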

Supervised Quantile Normalization for Low-rank Matrix Approximation

no code implementations8 Feb 2020 Marco Cuturi, Olivier Teboul, Jonathan Niles-Weed, Jean-Philippe Vert

Low rank matrix factorization is a fundamental building block in machine learning, used for instance to summarize gene expression profile data or word-document counts.

Massively scalable Sinkhorn distances via the Nyström method

no code implementations NeurIPS 2019 Jason Altschuler, Francis Bach, Alessandro Rudi, Jonathan Niles-Weed

The Sinkhorn "distance", a variant of the Wasserstein distance with entropic regularization, is an increasingly popular tool in machine learning and statistical inference.
