1 code implementation • 17 Jan 2024 • Nicolas Garcia Trillos, Matt Jacobs, Jakwang Kim, Matthew Werenski
Recent works have developed a connection between adversarial training (AT) in the multiclass classification setting and multimarginal optimal transport (MOT), unlocking a new set of tools to study this problem.
no code implementations • 13 Dec 2023 • Nicolas Garcia Trillos, Bodhisattva Sen
We then prove that, under appropriate identifiability assumptions on the model, our OT-based denoiser can be recovered solely from information of the marginal distribution of $Z$ and the posterior mean of the model, after solving a linear relaxation problem over a suitable space of couplings that is reminiscent of a standard multimarginal OT (MOT) problem.
no code implementations • 1 Oct 2023 • Chenghui Li, Rishi Sonthalia, Nicolas Garcia Trillos
There is a large variety of machine learning methodologies that are based on the extraction of spectral geometric information from data.
no code implementations • 5 Jul 2023 • Nicolas Garcia Trillos, Melanie Weber
Let $\mathcal{M} \subseteq \mathbb{R}^d$ denote a low-dimensional manifold and let $\mathcal{X}= \{ x_1, \dots, x_n \}$ be a collection of points uniformly sampled from $\mathcal{M}$.
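To make the setup concrete, here is a minimal sketch of drawing a uniform sample from the simplest low-dimensional manifold, the unit circle embedded in $\mathbb{R}^2$; the function name and sample size are illustrative, not from the paper.

```python
import numpy as np

def sample_circle(n, seed=0):
    """Uniformly sample n points from the unit circle, a 1-dimensional
    manifold embedded in R^2 -- a toy instance of the setup above."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
    return np.column_stack([np.cos(theta), np.sin(theta)])

# A point cloud X = {x_1, ..., x_n} sampled uniformly from the manifold.
X = sample_circle(500)
```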
1 code implementation • 4 May 2023 • Jose A. Carrillo, Nicolas Garcia Trillos, Sixu Li, Yuhua Zhu
Federated learning is an important framework in modern machine learning that seeks to integrate the training of learning models across multiple users, each with their own local data set, in a way that respects data privacy and communication-loss constraints.
no code implementations • 28 Apr 2023 • Nicolas Garcia Trillos, Matt Jacobs, Jakwang Kim
We study three models of the problem of adversarial training in multiclass classification designed to construct robust classifiers against adversarial perturbations of data in the agnostic-classifier setting.
no code implementations • 9 Jan 2023 • Camilo Garcia Trillos, Nicolas Garcia Trillos
These interacting particle dynamics are shown to converge toward appropriate mean-field limit equations in certain large-particle-number regimes.
no code implementations • 29 Sep 2022 • Yuetian Luo, Nicolas Garcia Trillos
To prove our results we provide a comprehensive landscape analysis of a matrix factorization problem with a least squares objective, which serves as a critical bridge.
1 code implementation • 27 Apr 2022 • Nicolas Garcia Trillos, Matt Jacobs, Jakwang Kim
We study a family of adversarial multiclass classification problems and provide equivalent reformulations in terms of: 1) a family of generalized barycenter problems introduced in the paper and 2) a family of multimarginal optimal transport problems where the number of marginals is equal to the number of classes in the original classification problem.
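The multimarginal problems studied here generalize the classical two-marginal Kantorovich problem, which can be written as a finite linear program. The sketch below solves that simpler two-marginal special case with SciPy's `linprog`; it is an illustration of the OT formulation, not the paper's multimarginal algorithm, and the function name is our own.

```python
import numpy as np
from scipy.optimize import linprog

def ot_lp(mu, nu, C):
    """Solve the two-marginal Kantorovich OT problem as a linear program:
    minimize <C, pi> over couplings pi with marginals mu and nu."""
    n, m = C.shape
    # Equality constraints: row sums of pi equal mu, column sums equal nu.
    A_eq = np.zeros((n + m, n * m))
    for i in range(n):
        A_eq[i, i * m:(i + 1) * m] = 1.0   # row-sum constraint for mu[i]
    for j in range(m):
        A_eq[n + j, j::m] = 1.0            # column-sum constraint for nu[j]
    b_eq = np.concatenate([mu, nu])
    res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None), method="highs")
    return res.x.reshape(n, m), res.fun

# Two uniform marginals with zero cost on the diagonal: the optimal
# coupling is the identity plan and the optimal cost is 0.
mu = np.array([0.5, 0.5])
nu = np.array([0.5, 0.5])
C = np.array([[0.0, 1.0], [1.0, 0.0]])
plan, cost = ot_lp(mu, nu, C)
```

The MOT problems in the paper replace the two marginal constraints with one constraint per class, so the number of marginals grows with the number of classes.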
no code implementations • 13 Sep 2021 • Camilo Garcia Trillos, Nicolas Garcia Trillos
In this paper we explore the relation between distributionally robust learning and different forms of regularization to enforce robustness of deep neural networks.
1 code implementation • NeurIPS 2023 • Nicolas Garcia Trillos, Pengfei He, Chenghui Li
We investigate sufficient conditions that similarity graphs on data sets must satisfy in order for their corresponding graph Laplacians to capture the right geometric information to solve the multi-manifold clustering (MMC) problem.
no code implementations • 21 Nov 2020 • Nicolas Garcia Trillos, Ryan Murray
Using the necessary conditions, we derive a geometric evolution equation which can be used to track the change in classification boundaries as $\varepsilon$ varies.
no code implementations • 13 Jul 2020 • Jeff Calder, Nicolas Garcia Trillos, Marta Lewicka
As a byproduct of our general regularity results, we obtain high probability $L^\infty$ and approximate $\mathcal{C}^{0, 1}$ convergence rates for the convergence of graph Laplacian eigenvectors towards eigenfunctions of the corresponding weighted Laplace-Beltrami operators.
1 code implementation • 26 Jun 2020 • Nicolas Garcia Trillos, Felix Morales, Javier Morales
In this paper we introduce two algorithms for neural architecture search (NASGD and NASAGD), following theoretical work by two of the authors [5] that used the geometric structure of optimal transport to lay the conceptual basis for new notions of traditional and accelerated gradient descent algorithms for the optimization of a function on a semi-discrete space.
1 code implementation • 26 Jun 2020 • Nicolas Garcia Trillos, Javier Morales
With this aim in mind, we discuss the geometric and theoretical motivation for new techniques for neural architecture search (in a companion paper we show that algorithms inspired by our framework are competitive with contemporaneous methods).
no code implementations • 20 Apr 2020 • Nicolas Garcia Trillos, Ryan Murray, Matthew Thorpe
In this work we study statistical properties of graph-based clustering algorithms that rely on the optimization of balanced graph cuts, the main example being the optimization of Cheeger cuts.
no code implementations • 29 Oct 2019 • Jeff Calder, Nicolas Garcia Trillos
In this paper we improve the spectral convergence rates for graph-based approximations of Laplace-Beltrami operators constructed from random data.
no code implementations • 6 Apr 2019 • Nicolas Garcia Trillos, Daniel Sanz-Alonso, Ruiyi Yang
Several data analysis techniques employ similarity relationships between data points to uncover the intrinsic dimension and geometric structure of the underlying data-generating mechanism.
no code implementations • 30 Jan 2019 • Nicolas Garcia Trillos, Franca Hoffmann, Bamdad Hosseini
More precisely, we assume that the data is sampled from a mixture model supported on a manifold $\mathcal{M}$ embedded in $\mathbb{R}^d$, and pick a connectivity length-scale $\varepsilon>0$ to construct a kernelized graph Laplacian.
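A minimal sketch of the kernelized graph Laplacian construction described above, using a Gaussian kernel at length-scale $\varepsilon$; the kernel choice and normalization here are one common convention, not necessarily the exact one used in the paper.

```python
import numpy as np

def kernelized_graph_laplacian(X, eps):
    """Unnormalized graph Laplacian L = D - W with Gaussian kernel
    weights W_ij = exp(-|x_i - x_j|^2 / (2 eps^2))."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    W = np.exp(-sq / (2.0 * eps ** 2))                   # kernel weights
    np.fill_diagonal(W, 0.0)                             # no self-weights
    D = np.diag(W.sum(axis=1))                           # degree matrix
    return D - W

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
L = kernelized_graph_laplacian(X, eps=0.5)
# L is symmetric positive semi-definite and annihilates constant vectors,
# so its low-lying spectrum carries the geometric information of the data.
evals = np.linalg.eigvalsh(L)
```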
no code implementations • 29 Jan 2019 • Nicolas Garcia Trillos, Zach Kaplan, Daniel Sanz-Alonso
The aim of this paper is to provide new theoretical and computational understanding on two loss regularizations employed in deep learning, known as local entropy and heat regularization.
no code implementations • 29 Jan 2019 • Nicolas Garcia Trillos, Ryan Murray
This paper investigates the use of methods from partial differential equations and the calculus of variations to study learning problems that are regularized using graph Laplacians.
no code implementations • 30 Jan 2018 • Nicolas Garcia Trillos, Moritz Gerlach, Matthias Hein, Dejan Slepcev
We study the convergence of the graph Laplacian of a random geometric graph generated by an i.i.d. sample from an $m$-dimensional submanifold $M$ in $\mathbb{R}^d$ as the sample size $n$ increases and the neighborhood size $h$ tends to zero.
no code implementations • 20 Oct 2017 • Nicolas Garcia Trillos, Zachary Kaplan, Thabo Samakhoana, Daniel Sanz-Alonso
A popular approach to semi-supervised learning proceeds by endowing the input data with a graph structure in order to extract geometric information and incorporate it into a Bayesian framework.
no code implementations • 22 Jun 2017 • Nicolas Garcia Trillos, Daniel Sanz-Alonso
We consider the problem of recovering a function input of a differential equation formulated on an unknown domain $M$.
no code implementations • 11 Feb 2017 • Nicolas Garcia Trillos
We consider a point cloud $X_n := \{ x_1, \dots, x_n \}$ uniformly distributed on the flat torus $\mathbb{T}^d := \mathbb{R}^d / \mathbb{Z}^d$, and construct a geometric graph on the cloud by connecting points that are within distance $\varepsilon$ of each other.
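The $\varepsilon$-graph construction on the flat torus can be sketched in a few lines; the coordinate-wise wrap-around below implements the flat-torus metric, and the parameter values are illustrative only.

```python
import numpy as np

def torus_epsilon_graph(points, eps):
    """Adjacency matrix of the epsilon-graph on the flat torus T^d = R^d / Z^d:
    connect points whose toroidal distance is at most eps."""
    diff = np.abs(points[:, None, :] - points[None, :, :])
    diff = np.minimum(diff, 1.0 - diff)          # wrap around: flat-torus metric
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    adj = (dist <= eps) & (dist > 0)             # no self-loops
    return adj.astype(int)

rng = np.random.default_rng(0)
X = rng.random((200, 2))                         # uniform sample on T^2
A = torus_epsilon_graph(X, eps=0.15)             # symmetric 0/1 adjacency
```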
no code implementations • 3 Jul 2016 • Nicolas Garcia Trillos
This paper studies the large sample asymptotics of data analysis procedures based on the optimization of functionals defined on $k$-NN graphs on point clouds.
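For concreteness, a minimal $k$-NN graph construction on a point cloud; the symmetrization convention (connect if either point is among the other's $k$ nearest) is one common choice and may differ from the paper's.

```python
import numpy as np

def knn_graph(X, k):
    """Adjacency matrix of the symmetrized k-NN graph on a point cloud."""
    n = X.shape[0]
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(sq, np.inf)                 # a point is not its own neighbor
    nbrs = np.argsort(sq, axis=1)[:, :k]         # k nearest indices per point
    A = np.zeros((n, n), dtype=int)
    rows = np.repeat(np.arange(n), k)
    A[rows, nbrs.ravel()] = 1
    return np.maximum(A, A.T)                    # symmetrize the directed k-NN relation
```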
no code implementations • 1 Jul 2016 • Nicolas Garcia Trillos, Ryan Murray
This work considers the problem of binary classification: given training data $x_1, \dots, x_n$ from a certain population, together with associated labels $y_1,\dots, y_n \in \left\{0, 1 \right\}$, determine the best label for an element $x$ not among the training data.
no code implementations • 24 Nov 2014 • Nicolas Garcia Trillos, Dejan Slepcev, James Von Brecht, Thomas Laurent, Xavier Bresson
We consider point clouds obtained as samples of a ground-truth measure.