Search Results for author: Andreas Maurer

Found 16 papers, 1 paper with code

Generalization for slowly mixing processes

no code implementations 28 Apr 2023 Andreas Maurer

A bound, uniform over various loss classes, is given for data generated by stationary, phi-mixing processes, where the mixing time (the time needed to obtain approximate independence) enters the sample complexity only additively.
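
For illustration (a schematic shape only, not the paper's exact statement): with mixing time \(\tau\) and a complexity parameter \(C\) for the loss class, a classical blocking argument reduces the effective sample size to \(n/\tau\), a multiplicative penalty, whereas a bound that is additive in the mixing time has the form

\[
\text{risk} \;\le\; \widehat{\text{risk}} \;+\; O\Bigl(\sqrt{C/n}\Bigr) \;+\; O\bigl(\tau/n\bigr)
\qquad\text{instead of}\qquad
\widehat{\text{risk}} \;+\; O\Bigl(\sqrt{C\tau/n}\Bigr).
\]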

Concentration inequalities under sub-Gaussian and sub-exponential conditions

no code implementations NeurIPS 2021 Andreas Maurer, Massimiliano Pontil

We prove analogues of the popular bounded difference inequality (also called McDiarmid's inequality) for functions of independent random variables under sub-gaussian and sub-exponential conditions.

Regression
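
For reference, the classical bounded difference (McDiarmid) inequality being generalized states: if changing the i-th argument of f alters its value by at most c_i, then for independent X_1, ..., X_n,

\[
\Pr\bigl(f(X_1,\dots,X_n) - \mathbb{E}f \ge t\bigr) \;\le\; \exp\Bigl(-\frac{2t^2}{\sum_{i=1}^{n} c_i^2}\Bigr).
\]

Roughly speaking, the paper's versions replace the worst-case constants c_i by sub-gaussian or sub-exponential norms of the coordinate-wise differences.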

Some Hoeffding- and Bernstein-type Concentration Inequalities

no code implementations 11 Feb 2021 Andreas Maurer, Massimiliano Pontil

We prove concentration inequalities for functions of independent random variables under sub-gaussian and sub-exponential conditions.
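
For orientation, the classical prototypes for sums read: if X_1, ..., X_n are independent with X_i in [a_i, b_i] (Hoeffding), or with |X_i - E X_i| <= M and total variance \(\sigma^2 = \sum_i \mathrm{Var}(X_i)\) (Bernstein), then

\[
\Pr\Bigl(\sum_{i}(X_i - \mathbb{E}X_i) \ge t\Bigr) \le \exp\Bigl(-\frac{2t^2}{\sum_i (b_i - a_i)^2}\Bigr)
\quad\text{and}\quad
\Pr\Bigl(\sum_{i}(X_i - \mathbb{E}X_i) \ge t\Bigr) \le \exp\Bigl(-\frac{t^2}{2\sigma^2 + 2Mt/3}\Bigr),
\]

respectively; the paper proves inequalities of this type for general functions of independent variables rather than sums.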

Robust Unsupervised Learning via L-Statistic Minimization

no code implementations 14 Dec 2020 Andreas Maurer, Daniela A. Parletta, Andrea Paudice, Massimiliano Pontil

Designing learning algorithms that are resistant to perturbations of the underlying data distribution is a problem of wide practical and theoretical importance.

Clustering
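
For context, a standard definition (not specific to this paper): an L-statistic is a linear combination of order statistics,

\[
T(X_1,\dots,X_n) \;=\; \sum_{i=1}^{n} c_i\, X_{(i)}, \qquad X_{(1)} \le \dots \le X_{(n)}.
\]

For example, setting c_i = 1/(n-k) for i <= n-k and c_i = 0 for the k largest values gives a trimmed mean; applied to per-sample losses, such statistics discard the largest losses and are therefore robust to a fraction of corrupted data points.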

Estimating weighted areas under the ROC curve

no code implementations NeurIPS 2020 Andreas Maurer, Massimiliano Pontil

Exponential bounds on the estimation error are given for the plug-in estimator of weighted areas under the ROC curve.
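
To fix notation (unweighted case shown; the weighted version introduces a weighting over the pairs or thresholds): for a score function s and independent positive and negative examples,

\[
\mathrm{AUC}(s) \;=\; \Pr\bigl(s(X^+) > s(X^-)\bigr),
\qquad
\widehat{\mathrm{AUC}}(s) \;=\; \frac{1}{mn}\sum_{i=1}^{m}\sum_{j=1}^{n} \mathbf{1}\bigl[s(X^+_i) > s(X^-_j)\bigr].
\]

The plug-in estimator is a two-sample U-statistic, and the abstract asserts exponential concentration of this estimator around the true value.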

Learning Fair and Transferable Representations

no code implementations NeurIPS 2020 Luca Oneto, Michele Donini, Andreas Maurer, Massimiliano Pontil

Developing learning methods which do not discriminate against subgroups of the population is a central goal of algorithmic fairness.

Fairness

Uniform concentration and symmetrization for weak interactions

no code implementations 5 Feb 2019 Andreas Maurer, Massimiliano Pontil

The standard method of deriving uniform bounds via Gaussian and Rademacher complexities is extended to the case where the sample average is replaced by a nonlinear statistic.
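
The classical instance of the method, stated for orientation, is the symmetrization bound for the sample mean: with independent Rademacher variables \(\sigma_1,\dots,\sigma_n\),

\[
\mathbb{E}\,\sup_{f\in\mathcal{F}} \Bigl( \mathbb{E}f(X) - \frac{1}{n}\sum_{i=1}^{n} f(X_i) \Bigr)
\;\le\; \frac{2}{n}\,\mathbb{E}\,\sup_{f\in\mathcal{F}} \sum_{i=1}^{n} \sigma_i f(X_i).
\]

The extension replaces the sample average on the left by a nonlinear statistic whose arguments interact only weakly.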

Empirical bounds for functions with weak interactions

no code implementations 11 Mar 2018 Andreas Maurer, Massimiliano Pontil

We provide sharp empirical estimates of expectation, variance and normal approximation for a class of statistics whose variation in any argument does not change too much when another argument is modified.
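
One way to formalize this condition (an illustrative reading of the abstract, not necessarily the paper's exact definitions): write \(D_i^{y} f\) for the change in f when its i-th argument is replaced by y,

\[
(D_i^{y} f)(x) \;=\; f(x_1,\dots,x_i,\dots,x_n) \;-\; f(x_1,\dots,y,\dots,x_n).
\]

The statistics in question satisfy \(|D_i^{y} f| \le c\) (each argument has bounded influence) together with \(|D_j^{z} D_i^{y} f| \le \beta\) for \(i \ne j\), with \(\beta\) much smaller than c; that is, modifying one argument barely changes the sensitivity to another.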

Bounds for Vector-Valued Function Estimation

no code implementations 5 Jun 2016 Andreas Maurer, Massimiliano Pontil

Multi-task learning and one-vs-all multi-category learning are treated as examples.

Multi-Task Learning

A vector-contraction inequality for Rademacher complexities

no code implementations 1 May 2016 Andreas Maurer

The contraction inequality for Rademacher averages is extended to Lipschitz functions with vector-valued domains, and it is also shown that in the bounding expression the Rademacher variables can be replaced by arbitrary iid symmetric and sub-gaussian variables.

Clustering
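
Up to the precise constants, the main inequality reads: if each \(h_i : \mathbb{R}^K \to \mathbb{R}\) is L-Lipschitz with respect to the Euclidean norm and \(\mathcal{F}\) is a class of \(\mathbb{R}^K\)-valued functions, then

\[
\mathbb{E}\,\sup_{f\in\mathcal{F}} \sum_{i=1}^{n} \sigma_i\, h_i\bigl(f(x_i)\bigr)
\;\le\; \sqrt{2}\,L\;\mathbb{E}\,\sup_{f\in\mathcal{F}} \sum_{i=1}^{n}\sum_{k=1}^{K} \sigma_{ik}\, f_k(x_i),
\]

where the \(\sigma_i\) and \(\sigma_{ik}\) are independent Rademacher variables (which, as the abstract notes, may be replaced by arbitrary i.i.d. symmetric sub-gaussian variables in the bounding expression).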

The Benefit of Multitask Representation Learning

no code implementations 23 May 2015 Andreas Maurer, Massimiliano Pontil, Bernardino Romera-Paredes

In particular, focusing on the important example of half-space learning, we derive the regime in which multitask representation learning is beneficial over independent task learning, as a function of the sample size, the number of tasks and the intrinsic data dimensionality.

Representation Learning
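
Schematically (shapes only, not the paper's exact bounds), with T tasks, n examples per task, a shared representation class \(\mathcal{H}\) and a class \(\mathcal{F}\) of task-specific predictors, the comparison has the form

\[
\text{multitask:}\ \ O\!\Bigl(\sqrt{\tfrac{C(\mathcal{H})}{nT}}\Bigr) + O\!\Bigl(\sqrt{\tfrac{c(\mathcal{F})}{n}}\Bigr)
\qquad\text{vs.}\qquad
\text{independent:}\ \ O\!\Bigl(\sqrt{\tfrac{C(\mathcal{H}) + c(\mathcal{F})}{n}}\Bigr),
\]

so sharing the representation helps when the representation class dominates the overall complexity and the number of tasks T is large.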

A chain rule for the expected suprema of Gaussian processes

no code implementations 10 Nov 2014 Andreas Maurer

The expected supremum of a Gaussian process indexed by the image of an index set under a function class is bounded in terms of separate properties of the index set and the function class.

Gaussian Processes
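
Schematically, writing G(A) for the expected supremum of the canonical Gaussian process over a set A, the chain rule bounds the composition by separate contributions,

\[
G\bigl(\mathcal{F}(S)\bigr) \;\lesssim\; L\,G(S) \;+\; \bigl(\text{a Gaussian-average term depending on } \mathcal{F} \text{ alone}\bigr),
\]

where L is a Lipschitz-type parameter of the function class; this is the shape of the result only, and the paper's statement specifies the constants and the second term precisely.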

Sparse coding for multitask and transfer learning

no code implementations 4 Sep 2012 Andreas Maurer, Massimiliano Pontil, Bernardino Romera-Paredes

We investigate the use of sparse coding and dictionary learning in the context of multitask and transfer learning.

Dictionary Learning
Transfer Learning
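
The generic template for this approach (a standard formulation; the paper's exact objective may differ in details) learns a dictionary D with K atoms shared across T tasks, together with a sparse code vector \(c_t\) per task, so that task t's predictor is \(w_t = D c_t\):

\[
\min_{D,\;c_1,\dots,c_T}\ \frac{1}{T}\sum_{t=1}^{T} \frac{1}{n}\sum_{i=1}^{n} \ell\bigl(\langle D c_t,\, x_{ti}\rangle,\; y_{ti}\bigr) \;+\; \lambda \sum_{t=1}^{T} \|c_t\|_1,
\]

with the dictionary atoms norm-constrained. For transfer learning, the learned D is kept fixed and only a new sparse code is fit on the target task.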
