Search Results for author: Damien Scieur

Found 18 papers, 4 papers with code

Acceleration through spectral density estimation

no code implementations ICML 2020 Fabian Pedregosa, Damien Scieur

We develop a framework for designing optimal optimization methods in terms of their average-case runtime.

Density Estimation · regression
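As a quick illustration of this framework (a sketch of the standard average-case setup for random quadratics, not the paper's verbatim notation): a first-order method run for k steps on a quadratic with random Hessian is characterized by a residual polynomial P_k, and its expected error is an integral against the expected spectral density μ of the Hessian:

```latex
\mathbb{E}\,\|x_k - x^\star\|^2 \;=\; \int P_k(\lambda)^2 \, d\mu(\lambda),
\qquad P_k(0) = 1 .
```

The average-case optimal method then chooses P_k to minimize this integral, which is where the estimated spectral density enters.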

Extra-gradient with player sampling for faster convergence in n-player games

no code implementations ICML 2020 Samy Jelassi, Carles Domingo-Enrich, Damien Scieur, Arthur Mensch, Joan Bruna

Data-driven modeling increasingly requires finding a Nash equilibrium in multi-player games, e.g. when training GANs.

Universal Asymptotic Optimality of Polyak Momentum

no code implementations ICML 2020 Damien Scieur, Fabian Pedregosa

We consider the average-case runtime analysis of algorithms for minimizing quadratic objectives.
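For reference, Polyak (heavy-ball) momentum, whose average-case optimality the paper studies, is the two-step recursion

```latex
x_{k+1} = x_k - \alpha\,\nabla f(x_k) + \beta\,(x_k - x_{k-1}),
```

with step size α and momentum parameter β.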

SING: A Plug-and-Play DNN Learning Technique

1 code implementation 25 May 2023 Adrien Courtois, Damien Scieur, Jean-Michel Morel, Pablo Arias, Thomas Eboli

We propose SING (StabIlized and Normalized Gradient), a plug-and-play technique that improves the stability and generalization of the Adam(W) optimizer.

Depth Estimation · Image Classification
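The abstract does not spell out the update rule; as a hedged sketch, one plausible reading of "stabilized and normalized gradient" is a per-tensor gradient standardization applied before the AdamW step (the function name and details below are illustrative, not the paper's exact method):

```python
import torch

def standardize_gradient(grad: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    # Center and rescale the gradient to zero mean and unit standard
    # deviation, stabilizing the scale of the subsequent update.
    return (grad - grad.mean()) / (grad.std() + eps)

# Plug-and-play usage: transform each parameter's gradient in place,
# then let AdamW perform its usual step on the standardized gradients.
# for p in model.parameters():
#     if p.grad is not None:
#         p.grad.copy_(standardize_gradient(p.grad))
# optimizer.step()
```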

The Curse of Unrolling: Rate of Differentiating Through Optimization

no code implementations 27 Sep 2022 Damien Scieur, Quentin Bertrand, Gauthier Gidel, Fabian Pedregosa

Computing the Jacobian of the solution of an optimization problem is a central problem in machine learning, with applications in hyperparameter optimization, meta-learning, optimization as a layer, and dataset distillation, to name a few.

Hyperparameter Optimization · Meta-Learning +1
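To make the setting concrete, here is a minimal sketch (using JAX and a ridge-regression inner problem of my choosing, not the paper's example) of the unrolling approach the paper analyzes: the inner optimizer's iterations stay in the autodiff graph, so the hyperparameter Jacobian of the approximate solution comes from differentiating through them.

```python
import jax
import jax.numpy as jnp

def inner_loss(w, lam, X, y):
    # Ridge-regression inner objective; lam is the hyperparameter.
    return jnp.mean((X @ w - y) ** 2) + lam * jnp.sum(w ** 2)

def unrolled_solve(lam, X, y, steps=100, lr=0.1):
    # Gradient descent on the inner problem; every iteration stays in
    # the computation graph, so the output is differentiable in lam.
    w = jnp.zeros(X.shape[1])
    grad_w = jax.grad(inner_loss)
    for _ in range(steps):
        w = w - lr * grad_w(w, lam, X, y)
    return w

# Jacobian of the (approximate) solution w.r.t. the hyperparameter,
# obtained by differentiating through the unrolled iterations.
X, y = jnp.ones((5, 3)), jnp.ones(5)
dw_dlam = jax.jacobian(unrolled_solve)(0.1, X, y)
```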

Only Tails Matter: Average-Case Universality and Robustness in the Convex Regime

no code implementations 20 Jun 2022 Leonardo Cunha, Gauthier Gidel, Fabian Pedregosa, Damien Scieur, Courtney Paquette

The recently developed average-case analysis of optimization methods allows a more fine-grained and representative convergence analysis than the usual worst-case results.

Convergence Rates for the MAP of an Exponential Family and Stochastic Mirror Descent -- an Open Problem

no code implementations 12 Nov 2021 Rémi Le Priol, Frederik Kunstner, Damien Scieur, Simon Lacoste-Julien

We consider the problem of upper bounding the expected log-likelihood sub-optimality of the maximum likelihood estimate (MLE), or a conjugate maximum a posteriori (MAP) for an exponential family, in a non-asymptotic way.

Connecting Sphere Manifolds Hierarchically for Regularization

no code implementations 25 Jun 2021 Damien Scieur, Youngsung Kim

This paper considers classification problems with hierarchically organized classes.

Acceleration Methods

1 code implementation 23 Jan 2021 Alexandre d'Aspremont, Damien Scieur, Adrien Taylor

This monograph covers some recent advances in a range of acceleration techniques frequently used in convex optimization.

Generalization of Quasi-Newton Methods: Application to Robust Symmetric Multisecant Updates

no code implementations 6 Nov 2020 Damien Scieur, Lewis Liu, Thomas Pumir, Nicolas Boumal

Quasi-Newton techniques approximate the Newton step by estimating the Hessian using the so-called secant equations.

Optimization and Control · Numerical Analysis
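For context, the (single) secant equation asks the updated Hessian estimate B to map the step s = x_{k+1} - x_k to the gradient difference y = ∇f(x_{k+1}) - ∇f(x_k). A minimal sketch using the classical rank-one Broyden update (the paper's symmetric multisecant updates generalize this to several secant pairs at once):

```python
import numpy as np

def broyden_update(B, s, y):
    # Smallest rank-one correction to B that enforces the secant
    # equation B_new @ s = y.
    return B + np.outer(y - B @ s, s) / (s @ s)

# s = x_{k+1} - x_k, y = grad f(x_{k+1}) - grad f(x_k)
B = np.eye(2)
s = np.array([1.0, 0.0])
y = np.array([2.0, 0.5])
B_new = broyden_update(B, s, y)
assert np.allclose(B_new @ s, y)  # secant equation holds
```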

Average-case Acceleration for Bilinear Games and Normal Matrices

no code implementations ICLR 2021 Carles Domingo-Enrich, Fabian Pedregosa, Damien Scieur

First, we show that for zero-sum bilinear games the average-case optimal method is the optimal method for the minimization of the Hamiltonian.
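For a zero-sum bilinear game min_x max_y x^T A y, the Hamiltonian referred to here can be read (an assumption following the usual game-dynamics convention, not necessarily the paper's verbatim definition) as half the squared norm of the simultaneous-gradient vector field:

```latex
\mathcal{H}(x, y) \;=\; \tfrac{1}{2}\,\|A y\|^2 + \tfrac{1}{2}\,\|A^\top x\|^2 ,
```

so driving H to zero drives both players' gradients to zero, i.e. toward the equilibrium.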

Average-case Acceleration Through Spectral Density Estimation

no code implementations 12 Feb 2020 Fabian Pedregosa, Damien Scieur

We develop a framework for the average-case analysis of random quadratic problems and derive algorithms that are optimal under this analysis.

Density Estimation · regression

Extragradient with player sampling for faster Nash equilibrium finding

1 code implementation 29 May 2019 Samy Jelassi, Carles Domingo-Enrich, Damien Scieur, Arthur Mensch, Joan Bruna

Data-driven modeling increasingly requires finding a Nash equilibrium in multi-player games, e.g. when training GANs.

Nonlinear Acceleration of CNNs

1 code implementation 1 Jun 2018 Damien Scieur, Edouard Oyallon, Alexandre d'Aspremont, Francis Bach

The Regularized Nonlinear Acceleration (RNA) algorithm is an acceleration method capable of improving the rate of convergence of many optimization schemes, such as gradient descent, SAGA, or SVRG.

Online Regularized Nonlinear Acceleration

no code implementations 24 May 2018 Damien Scieur, Edouard Oyallon, Alexandre d'Aspremont, Francis Bach

Regularized nonlinear acceleration (RNA) estimates the minimum of a function by post-processing iterates from an algorithm such as the gradient method.

General Classification
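A minimal sketch of the RNA extrapolation step as usually described (the regularization constant and normalization below are illustrative choices): given iterates x_0, ..., x_k, pick weights c that sum to one and minimize the norm of the combined residuals, then return the weighted combination.

```python
import numpy as np

def rna(iterates, lam=1e-8):
    # Regularized Nonlinear Acceleration: post-process iterates from
    # e.g. the gradient method into a (hopefully) better point.
    X = np.stack(iterates, axis=1)        # columns x_0, ..., x_k
    R = X[:, 1:] - X[:, :-1]              # residuals x_{i+1} - x_i
    K = R.T @ R
    K = K / np.linalg.norm(K) + lam * np.eye(K.shape[0])
    z = np.linalg.solve(K, np.ones(K.shape[0]))
    c = z / z.sum()                       # weights summing to one
    return X[:, :-1] @ c                  # extrapolated estimate
```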

Integration Methods and Optimization Algorithms

no code implementations NeurIPS 2017 Damien Scieur, Vincent Roulet, Francis Bach, Alexandre d'Aspremont

We show that accelerated optimization methods can be seen as particular instances of multi-step integration schemes from numerical analysis, applied to the gradient flow equation.
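The basic correspondence the abstract alludes to: gradient descent is the explicit Euler discretization of the gradient flow ODE, and multi-step schemes yield accelerated methods. Concretely,

```latex
\dot{x}(t) = -\nabla f\big(x(t)\big)
\quad\xrightarrow{\ \text{explicit Euler, step } h\ }\quad
x_{k+1} = x_k - h\,\nabla f(x_k).
```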

Nonlinear Acceleration of Stochastic Algorithms

no code implementations NeurIPS 2017 Damien Scieur, Francis Bach, Alexandre d'Aspremont

Here, we study extrapolation methods in a stochastic setting, where the iterates are produced by either a simple or an accelerated stochastic gradient algorithm.
