Search Results for author: Christophe Giraud

Found 11 papers, 3 papers with code

Estimating the history of a random recursive tree

no code implementations • 14 Mar 2024 • Simon Briend, Christophe Giraud, Gábor Lugosi, Déborah Sulem

This paper studies the problem of estimating the order of arrival of the vertices in a random recursive tree.
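For intuition about the model, here is a minimal sketch (a toy illustration under my own assumptions, not the estimator analysed in the paper) that grows a uniform random recursive tree and then ranks vertices by degree as a crude proxy for arrival order, exploiting the fact that early vertices tend to accumulate the largest degrees.

```python
import numpy as np

def random_recursive_tree(n, rng):
    """Grow a uniform random recursive tree on n vertices: vertex t
    attaches to a uniformly chosen earlier vertex."""
    parent = np.zeros(n, dtype=int)   # parent[0] is unused (vertex 0 is the root)
    for t in range(1, n):
        parent[t] = rng.integers(0, t)
    return parent

def degree_ranking(parent):
    """Crude arrival-order estimate: rank vertices by degree, since early
    vertices tend to accumulate many children."""
    n = len(parent)
    deg = np.ones(n)                  # one edge towards the parent...
    deg[0] = 0                        # ...except for the root
    np.add.at(deg, parent[1:], 1)     # plus one edge per child
    return np.argsort(-deg)           # high degree ranked as early arrival

rng = np.random.default_rng(0)
parent = random_recursive_tree(2000, rng)
print("estimated earliest vertices:", degree_ranking(parent)[:5])
```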

Parameter-free projected gradient descent

no code implementations • 31 May 2023 • Evgenii Chzhen, Christophe Giraud, Gilles Stoltz

We consider the problem of minimizing a convex function over a closed convex set, with Projected Gradient Descent (PGD).
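For reference, the sketch below implements vanilla PGD with a hand-picked step size and a Euclidean projection onto a ball; the step size, iteration budget, and constraint set are illustrative assumptions, and the paper's point is precisely to dispense with such tuning.

```python
import numpy as np

def project_onto_ball(x, radius=1.0):
    """Euclidean projection onto the closed ball {x : ||x|| <= radius}."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def pgd(grad, x0, step=0.1, n_iter=200, radius=1.0):
    """Vanilla projected gradient descent with a fixed step size.
    (The paper removes the need to tune such parameters; the step size
    here is hand-picked purely for illustration.)"""
    x = project_onto_ball(np.asarray(x0, dtype=float), radius)
    for _ in range(n_iter):
        x = project_onto_ball(x - step * grad(x), radius)
    return x

# Minimize ||x - c||^2 over the unit ball, with c outside the ball.
c = np.array([2.0, 1.0])
x_star = pgd(lambda x: 2 * (x - c), x0=np.zeros(2))
print(x_star, c / np.linalg.norm(c))  # the optimum is the projection of c
```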

Stochastic Optimization

The price of unfairness in linear bandits with biased feedback

no code implementations • 18 Mar 2022 • Solenne Gaucher, Alexandra Carpentier, Christophe Giraud

We also derive gap-dependent upper bounds on the regret, and matching lower bounds for some problem instances. Interestingly, these results reveal a transition between a regime where the problem is as difficult as its unbiased counterpart, and a regime where it can be much harder.

Attribute Decision Making

Training Integrable Parameterizations of Deep Neural Networks in the Infinite-Width Limit

1 code implementation • 29 Oct 2021 • Karl Hajjar, Lénaïc Chizat, Christophe Giraud

For two-layer neural networks, it has been understood via these asymptotics that the nature of the trained model radically changes depending on the scale of the initial random weights, ranging from a kernel regime (for large initial variance) to a feature learning regime (for small initial variance).
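As a rough numerical illustration of the role of the initialization scale (my own toy setup, not the paper's exact parameterization), the sketch below compares the magnitude of a two-layer ReLU network's output at initialization when the output weights have standard deviation of order $1/\sqrt{m}$, commonly associated with the kernel regime, versus order $1/m$, associated with feature learning: the former stays of order one as the width $m$ grows while the latter vanishes.

```python
import numpy as np

def typical_init_output(m, a_std, d=20, n_trials=200, seed=0):
    """Average |f(x)| at initialization for f(x) = sum_j a_j relu(w_j . x),
    with hidden weights w_j ~ N(0, I/d) and output weights a_j ~ N(0, a_std^2)."""
    rng = np.random.default_rng(seed)
    vals = []
    for _ in range(n_trials):
        x = rng.standard_normal(d)
        W = rng.standard_normal((m, d)) / np.sqrt(d)
        a = a_std * rng.standard_normal(m)
        vals.append(abs(a @ np.maximum(W @ x, 0.0)))
    return float(np.mean(vals))

for m in (1_000, 10_000):
    big = typical_init_output(m, a_std=m ** -0.5)   # "large" initial variance
    small = typical_init_output(m, a_std=1.0 / m)   # "small" initial variance
    print(f"m={m:6d}  large-variance init |f| ~ {big:.3f}   small-variance init |f| ~ {small:.4f}")
```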

Image Classification

Localization in 1D non-parametric latent space models from pairwise affinities

no code implementations • 6 Aug 2021 • Christophe Giraud, Yann Issartel, Nicolas Verzelen

We consider the problem of estimating latent positions in a one-dimensional torus from pairwise affinities.
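To make the setting concrete, here is a small sketch under an illustrative affinity model of my own choosing (affinities decaying with angular distance, plus noise, rather than the paper's nonparametric model), together with a basic spectral heuristic that recovers the positions up to a global rotation and reflection of the torus; it is not the localization procedure studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)      # latent positions on the torus

# Illustrative affinity model (an assumption, not the paper's nonparametric one):
# affinities decrease with the angular distance, plus symmetric noise.
A = np.cos(theta[:, None] - theta[None, :]) + 0.3 * rng.standard_normal((n, n))
A = (A + A.T) / 2.0
np.fill_diagonal(A, 0.0)

# Basic spectral heuristic: the two leading eigenvectors behave like
# (cos theta_i) and (sin theta_i), so their angle recovers the positions
# up to a global rotation and reflection.
eigvals, eigvecs = np.linalg.eigh(A)
u, v = eigvecs[:, -1], eigvecs[:, -2]
theta_hat = np.mod(np.arctan2(v, u), 2.0 * np.pi)
print("first few estimated positions:", np.round(theta_hat[:5], 2))
```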

A Unified Approach to Fair Online Learning via Blackwell Approachability

no code implementations • NeurIPS 2021 • Evgenii Chzhen, Christophe Giraud, Gilles Stoltz

We provide a setting and a general approach to fair online learning with stochastic sensitive and non-sensitive contexts.

Fairness

Pair-Matching: Links Prediction with Adaptive Queries

no code implementations • 17 May 2019 • Christophe Giraud, Yann Issartel, Luc Lehéricy, Matthieu Lerasle

This paper shows that sublinear regret is achievable in the case where the graph is generated according to a Stochastic Block Model (SBM) with two communities.
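For readers unfamiliar with the model, the following sketch simply samples a graph from a two-community SBM; the community assignment, edge probabilities, and graph size are illustrative assumptions, and the sketch does not implement the adaptive pair-matching strategy of the paper.

```python
import numpy as np

def sbm_two_communities(n, p_in, p_out, seed=0):
    """Sample an undirected two-community Stochastic Block Model:
    within-community edges appear with probability p_in, across-community
    edges with probability p_out."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, 2, size=n)                  # community of each vertex
    same = labels[:, None] == labels[None, :]
    probs = np.where(same, p_in, p_out)
    upper = rng.random((n, n)) < probs
    adj = np.triu(upper, k=1)                            # keep the strict upper triangle
    adj = adj | adj.T                                    # symmetrize, no self-loops
    return adj.astype(int), labels

adj, labels = sbm_two_communities(n=200, p_in=0.2, p_out=0.02)
print("edges:", adj.sum() // 2)
```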

Community Detection Stochastic Block Model

Partial recovery bounds for clustering with the relaxed $K$-means

no code implementations • 19 Jul 2018 • Christophe Giraud, Nicolas Verzelen

We investigate the clustering performance of the relaxed $K$-means in the settings of the sub-Gaussian Mixture Model (sGMM) and the Stochastic Block Model (SBM).
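As a baseline illustration (not the relaxed estimator analysed in the paper), the sketch below runs plain Lloyd's $K$-means on a toy two-component Gaussian mixture; the separation, dimension, and sample sizes are arbitrary choices.

```python
import numpy as np

def lloyd_kmeans(X, k, n_iter=50, seed=0):
    """Plain Lloyd's K-means (the paper analyses a convex *relaxation* of the
    K-means criterion; this is only the classical baseline)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Toy sub-Gaussian mixture: two well-separated Gaussian clusters in R^5.
rng = np.random.default_rng(1)
X = np.vstack([rng.standard_normal((100, 5)) + 4.0,
               rng.standard_normal((100, 5)) - 4.0])
labels, _ = lloyd_kmeans(X, k=2)
print("cluster sizes:", np.bincount(labels))
```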

Clustering Stochastic Block Model

PECOK: a convex optimization approach to variable clustering

1 code implementation • 16 Jun 2016 • Florentina Bunea, Christophe Giraud, Martin Royer, Nicolas Verzelen

The problem of variable clustering is that of grouping similar components of a $p$-dimensional vector $X=(X_{1},\ldots, X_{p})$, and estimating these groups from $n$ independent copies of $X$.
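To illustrate the problem setup, here is a toy sketch that generates variables as noisy copies of a few latent group variables and then groups them with a naive correlation-threshold heuristic. It is only meant to show what variable clustering means: PECOK itself is a convex (semidefinite) relaxation, not this heuristic, and the threshold below is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, k = 500, 12, 3
membership = np.repeat(np.arange(k), p // k)                # true partition of the variables
Z = rng.standard_normal((n, k))                              # latent group variables
X = Z[:, membership] + 0.3 * rng.standard_normal((n, p))     # each variable = noisy copy of its group

# Naive correlation-threshold heuristic (illustration only; not the PECOK estimator).
corr = np.corrcoef(X, rowvar=False)
groups = -np.ones(p, dtype=int)
label = 0
for j in range(p):
    if groups[j] == -1:
        mask = (corr[j] > 0.5) & (groups == -1)              # strongly correlated, still unassigned
        groups[mask] = label
        label += 1
print("recovered groups:", groups)
```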

Statistics Theory

Model Assisted Variable Clustering: Minimax-optimal Recovery and Algorithms

1 code implementation • 8 Aug 2015 • Florentina Bunea, Christophe Giraud, Xi Luo, Martin Royer, Nicolas Verzelen

We quantify the difficulty of clustering data generated from a G-block covariance model in terms of cluster proximity, measured with respect to two related, but different, cluster separation metrics.

Clustering

Aggregation of predictors for nonstationary sub-linear processes and online adaptive forecasting of time varying autoregressive processes

no code implementations • 27 Apr 2014 • Christophe Giraud, François Roueff, Andres Sanchez-Perez

The proposed forecaster is obtained by aggregating a finite number of well-chosen predictors, each of which enjoys an optimal minimax convergence rate under specific smoothness conditions on the TVAR coefficients.
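As a generic illustration of the aggregation idea (not the specific predictors or tuning analysed in the paper), the sketch below aggregates a few fixed AR(1) forecasters with exponential weights on a toy time-varying autoregressive series; the learning rate and the choice of experts are illustrative assumptions.

```python
import numpy as np

def exp_weights_aggregation(y, expert_preds, eta=2.0):
    """Online aggregation of experts with exponential weights under squared
    loss: at each time t, forecast the weighted average of the experts, then
    re-weight them according to their losses. The learning rate eta is an
    illustrative choice, not the tuning analysed in the paper."""
    T, K = expert_preds.shape
    w = np.full(K, 1.0 / K)
    forecasts = np.empty(T)
    for t in range(T):
        forecasts[t] = w @ expert_preds[t]
        losses = (expert_preds[t] - y[t]) ** 2
        w = w * np.exp(-eta * losses)
        w = w / w.sum()
    return forecasts

# Toy time-varying AR(1): the true coefficient drifts slowly, and each
# "expert" predicts with a different fixed coefficient.
rng = np.random.default_rng(3)
T = 1000
phi = np.linspace(0.2, 0.8, T)                 # slowly varying AR coefficient
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi[t] * y[t - 1] + 0.1 * rng.standard_normal()
coefs = np.array([0.2, 0.5, 0.8])
experts = np.vstack([c * np.concatenate(([0.0], y[:-1])) for c in coefs]).T
agg = exp_weights_aggregation(y, experts)
print("aggregated MSE:", np.mean((agg - y) ** 2))
```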
