no code implementations • 28 Apr 2023 • Andreas Maurer
A bound, uniform over various loss classes, is given for data generated by stationary, phi-mixing processes, where the mixing time (the time needed to reach approximate independence) enters the sample complexity only additively.
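The qualitative point can be sketched schematically. Below, $n_{\mathrm{iid}}$ and $\tau_\phi$ are illustrative symbols for the i.i.d. sample complexity and the mixing time, not the paper's exact quantities:

```latex
% Schematic contrast only; the paper's precise statement differs.
\[
\underbrace{n(\epsilon,\delta) \;\approx\; \tau_\phi \, n_{\mathrm{iid}}(\epsilon,\delta)}_{\text{blocking-style analyses (multiplicative)}}
\qquad\text{vs.}\qquad
\underbrace{n(\epsilon,\delta) \;\approx\; n_{\mathrm{iid}}(\epsilon,\delta) + O(\tau_\phi)}_{\text{additive dependence, as claimed here}}
\]
```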
1 code implementation • 27 May 2022 • Vladimir Kostic, Pietro Novelli, Andreas Maurer, Carlo Ciliberto, Lorenzo Rosasco, Massimiliano Pontil
We formalize a framework to learn the Koopman operator from finite data trajectories of the dynamical system.
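As a rough illustration of the setup (not the paper's estimator, which comes with statistical guarantees), here is a minimal EDMD-style least-squares sketch: fit a linear operator on features of consecutive states. The feature map and toy system below are invented for the example:

```python
import numpy as np

def edmd_koopman(traj, features):
    """Least-squares estimate of a finite-dimensional Koopman matrix K
    from a single trajectory, so that phi(x_{t+1}) ~= K phi(x_t)."""
    X = features(traj[:-1])                    # phi(x_t),     shape (T, p)
    Y = features(traj[1:])                     # phi(x_{t+1}), shape (T, p)
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)  # solves X B ~= Y
    return B.T

# Toy usage: noisy linear system with affine features [x, 1].
rng = np.random.default_rng(0)
A = np.array([[0.9, 0.1], [0.0, 0.8]])
x = np.zeros((200, 2)); x[0] = [1.0, 1.0]
for t in range(199):
    x[t + 1] = A @ x[t] + 0.01 * rng.standard_normal(2)

phi = lambda s: np.hstack([s, np.ones((len(s), 1))])
K = edmd_koopman(x, phi)
# Eigenvalues of K approximate the Koopman eigenvalues (here ~1.0, 0.9, 0.8).
print(np.sort(np.linalg.eigvals(K).real)[::-1])
```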
no code implementations • NeurIPS 2021 • Andreas Maurer, Massimiliano Pontil
We prove analogues of the popular bounded difference inequality (also called McDiarmid's inequality) for functions of independent random variables under sub-gaussian and sub-exponential conditions.
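For reference, the classical inequality being generalized: if changing the value of the $i$-th argument changes $f$ by at most $c_i$, then

```latex
\[
\Pr\bigl( f(X_1,\dots,X_n) - \mathbb{E} f(X_1,\dots,X_n) \ge t \bigr)
\;\le\; \exp\!\left( - \frac{2 t^2}{\sum_{i=1}^{n} c_i^2} \right).
\]
```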
no code implementations • 11 Feb 2021 • Andreas Maurer, Massimiliano Pontil
We prove concentration inequalities for functions of independent random variables under sub-gaussian and sub-exponential conditions.
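For reference, one common convention for these two moment conditions on a centered random variable $X$:

```latex
\[
\text{sub-gaussian with parameter } \sigma^2:\qquad
\mathbb{E}\, e^{\lambda X} \le e^{\lambda^2 \sigma^2 / 2}
\quad \text{for all } \lambda \in \mathbb{R};
\]
\[
\text{sub-exponential with parameters } (\nu^2, b):\qquad
\mathbb{E}\, e^{\lambda X} \le e^{\lambda^2 \nu^2 / 2}
\quad \text{for all } |\lambda| < 1/b.
\]
```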
no code implementations • 14 Dec 2020 • Andreas Maurer, Daniela A. Parletta, Andrea Paudice, Massimiliano Pontil
Designing learning algorithms that are resistant to perturbations of the underlying data distribution is a problem of wide practical and theoretical importance.
no code implementations • NeurIPS 2020 • Luca Oneto, Michele Donini, Giulia Luise, Carlo Ciliberto, Andreas Maurer, Massimiliano Pontil
One way to reach the goal of fairness is to modify the data representation so that it meets certain fairness constraints.
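The paper's construction is more sophisticated, but the basic idea of constraining a representation can be sketched with the crudest such constraint, zero covariance between features and a sensitive attribute. Everything below (the helper `decorrelate`, the toy data) is invented for illustration:

```python
import numpy as np

def decorrelate(X, s):
    """Remove the component of each feature linearly predictable from the
    sensitive attribute s, enforcing zero covariance with s -- a far
    simpler constraint than the paper's.

    X: (n, d) feature matrix, s: (n,) sensitive attribute.
    """
    Xc = X - X.mean(axis=0)
    sc = (s - s.mean()).reshape(-1, 1)
    beta = (sc.T @ Xc) / (sc.T @ sc)   # regress each feature on s ...
    return Xc - sc @ beta              # ... and keep the residuals

rng = np.random.default_rng(0)
s = rng.integers(0, 2, size=500).astype(float)
X = rng.standard_normal((500, 3)) + np.outer(s, [1.0, -0.5, 0.0])
Z = decorrelate(X, s)
# Empirical covariance of each new feature with s is ~0.
print(np.abs((Z * (s - s.mean())[:, None]).mean(axis=0)))
```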
no code implementations • NeurIPS 2020 • Andreas Maurer, Massimiliano Pontil
Exponential bounds on the estimation error are given for the plug-in estimator of weighted areas under the ROC curve.
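The paper treats weighted areas under the ROC curve; the unweighted plug-in estimator it builds on is simply the fraction of correctly ranked positive/negative pairs, for example:

```python
import numpy as np

def empirical_auc(scores_pos, scores_neg):
    """Plug-in (empirical) AUC: the fraction of positive/negative pairs
    ranked correctly by the score, counting ties as 1/2."""
    diff = scores_pos[:, None] - scores_neg[None, :]
    return ((diff > 0) + 0.5 * (diff == 0)).mean()

rng = np.random.default_rng(0)
pos = rng.normal(1.0, 1.0, size=300)   # scores of positive examples
neg = rng.normal(0.0, 1.0, size=400)   # scores of negative examples
print(empirical_auc(pos, neg))         # ~ Phi(1/sqrt(2)) ~ 0.76
```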
no code implementations • NeurIPS 2020 • Luca Oneto, Michele Donini, Andreas Maurer, Massimiliano Pontil
Developing learning methods which do not discriminate subgroups in the population is a central goal of algorithmic fairness.
no code implementations • 5 Feb 2019 • Andreas Maurer, Massimiliano Pontil
The method of deriving uniform bounds via Gaussian and Rademacher complexities is extended to the case where the sample average is replaced by a nonlinear statistic.
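For context, the standard sample-average version of such bounds: with empirical Rademacher complexity

```latex
\[
\widehat{\mathcal{R}}(\mathcal{F}) =
\mathbb{E}_{\epsilon} \sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \epsilon_i f(X_i),
\]
% one has, for f taking values in [0, 1], with probability at least 1 - \delta,
% uniformly over f in F:
\[
\mathbb{E} f(X) \;\le\; \frac{1}{n}\sum_{i=1}^{n} f(X_i)
\;+\; 2\,\widehat{\mathcal{R}}(\mathcal{F})
\;+\; 3\sqrt{\frac{\ln(2/\delta)}{2n}} .
\]
```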
no code implementations • 11 Mar 2018 • Andreas Maurer, Massimiliano Pontil
We provide sharp empirical estimates of expectation, variance and normal approximation for a class of statistics whose variation in any argument does not change too much when another argument is modified.
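Schematically (the paper's exact conditions and constants may differ), this is a bound on first differences together with smallness of mixed second differences: writing $D^k_y f$ for the change of $f$ when its $k$-th argument is replaced by $y$,

```latex
\[
\bigl\| D^{k}_{y} f \bigr\|_\infty \le b
\qquad\text{and}\qquad
\bigl\| D^{l}_{y'} D^{k}_{y} f \bigr\|_\infty \le \frac{a}{n}
\quad \text{for } l \neq k .
\]
```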
no code implementations • 5 Jun 2016 • Andreas Maurer, Massimiliano Pontil
Multi-task learning and one-vs-all multi-category learning are treated as examples.
no code implementations • 1 May 2016 • Andreas Maurer
The contraction inequality for Rademacher averages is extended to Lipschitz functions with vector-valued domains, and it is also shown that in the bounding expression the Rademacher variables can be replaced by arbitrary iid symmetric and sub-gaussian variables.
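For comparison, the classical scalar contraction inequality of Ledoux and Talagrand, followed by the vector-valued form described here (the constant $\sqrt{2}$ is from memory and should be checked against the paper):

```latex
% Scalar case: \phi_i : \mathbb{R} \to \mathbb{R} are L-Lipschitz.
\[
\mathbb{E} \sup_{f \in \mathcal{F}} \sum_{i} \epsilon_i \, \phi_i(f(x_i))
\;\le\; L \, \mathbb{E} \sup_{f \in \mathcal{F}} \sum_{i} \epsilon_i \, f(x_i).
\]
% Vector case: h_i : \mathbb{R}^K \to \mathbb{R} are L-Lipschitz in the
% Euclidean norm, f = (f_1, \dots, f_K), \epsilon_{ik} i.i.d. Rademacher.
\[
\mathbb{E} \sup_{f \in \mathcal{F}} \sum_{i} \epsilon_i \, h_i(f(x_i))
\;\le\; \sqrt{2}\, L \, \mathbb{E} \sup_{f \in \mathcal{F}} \sum_{i,k} \epsilon_{ik} \, f_k(x_i).
\]
```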
no code implementations • 23 May 2015 • Andreas Maurer, Massimiliano Pontil, Bernardino Romera-Paredes
In particular, focusing on the important example of half-space learning, we derive the regime in which multitask representation learning is beneficial over independent task learning, as a function of the sample size, the number of tasks and the intrinsic data dimensionality.
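The qualitative shape of such trade-offs (schematic only; $C(\mathcal{H})$ and $C(\mathcal{F})$ are placeholder complexity measures, not the paper's quantities): with $T$ tasks, $n$ examples per task, a shared representation class $\mathcal{H}$ and a task-specific class $\mathcal{F}$,

```latex
\[
\underbrace{O\!\left( \sqrt{\tfrac{C(\mathcal{H})}{nT}} + \sqrt{\tfrac{C(\mathcal{F})}{n}} \right)}_{\text{multitask representation learning}}
\qquad\text{vs.}\qquad
\underbrace{O\!\left( \sqrt{\tfrac{C(\mathcal{H}) + C(\mathcal{F})}{n}} \right)}_{\text{independent task learning}}
\]
% so sharing the representation pays off once T is large relative to the
% ratio of representation complexity to task-specific complexity.
```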
no code implementations • 10 Nov 2014 • Andreas Maurer
The expected supremum of a Gaussian process indexed by the image of an index set under a function class is bounded in terms of separate properties of the index set and the function class.
no code implementations • 8 Feb 2014 • Andreas Maurer, Massimiliano Pontil, Bernardino Romera-Paredes
An inequality is derived from concentration inequalities for the suprema of Gaussian or Rademacher processes.
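The Gaussian concentration input here is of Borell-TIS type: for a centered Gaussian process $(X_t)_{t \in T}$ with $\sigma^2 = \sup_{t \in T} \mathbb{E} X_t^2$,

```latex
\[
\Pr\Bigl( \sup_{t \in T} X_t \;\ge\; \mathbb{E} \sup_{t \in T} X_t + s \Bigr)
\;\le\; \exp\!\left( - \frac{s^2}{2 \sigma^2} \right).
\]
```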
no code implementations • 4 Sep 2012 • Andreas Maurer, Massimiliano Pontil, Bernardino Romera-Paredes
We investigate the use of sparse coding and dictionary learning in the context of multitask and transfer learning.
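A minimal sketch of the general recipe (not the paper's precise algorithm): learn a dictionary whose atoms source-task predictors sparsely combine, then sparse-code a related target-task predictor in that dictionary. All sizes, noise levels, and the toy data are invented for the example:

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

# Toy data: 100 source-task parameter vectors that are sparse
# combinations of 5 shared ground-truth atoms.
rng = np.random.default_rng(0)
atoms = rng.standard_normal((5, 20))
codes = rng.standard_normal((100, 5)) * (rng.random((100, 5)) < 0.3)
W_source = codes @ atoms + 0.01 * rng.standard_normal((100, 20))

# Learn a shared dictionary from the source tasks ...
dl = DictionaryLearning(n_components=5, alpha=0.1,
                        transform_algorithm='lasso_lars', random_state=0)
dl.fit(W_source)

# ... then transfer: sparse-code a new, related target-task vector in it.
new_code = np.zeros(5); new_code[[1, 3]] = [1.0, -0.5]
w_target = (new_code @ atoms).reshape(1, -1)
sparse_code = dl.transform(w_target)
print(np.count_nonzero(sparse_code), "of 5 coefficients are nonzero")
```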