1 code implementation • 28 Nov 2023 • Jean Ogier du Terrail, Quentin Klopfenstein, Honghao Li, Imke Mayer, Nicolas Loiseau, Mohammad Hallal, Michael Debouver, Thibault Camalon, Thibault Fouqueray, Jorge Arellano Castro, Zahia Yanes, Laetitia Dahan, Julien Taïeb, Pierre Laurent-Puig, Jean-Baptiste Bachet, Shulin Zhao, Remy Nicolle, Jérome Cros, Daniel Gonzalez, Robert Carreras-Torres, Adelaida Garcia Velasco, Kawther Abdilleh, Sudheer Doss, Félix Balazard, Mathieu Andreux
External control arms (ECA) can inform the early clinical development of experimental drugs and provide efficacy evidence for regulatory approval.
2 code implementations • 26 Oct 2022 • Johan Larsson, Quentin Klopfenstein, Mathurin Massias, Jonas Wallin
The lasso is the most famous sparse regression and feature selection method.
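The lasso mentioned here solves a least-squares problem with an L1 penalty, which zeroes out irrelevant coefficients. As a minimal numpy-only sketch (not the paper's implementation; data, function names, and parameters are illustrative), coordinate descent with soft-thresholding is a standard way to fit it:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * |.| (elementwise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for (1/2n)||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    resid = y - X @ beta
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            resid += X[:, j] * beta[j]          # remove j-th contribution
            rho = X[:, j] @ resid / n           # partial correlation
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
            resid -= X[:, j] * beta[j]
    return beta
```

On data generated from a sparse ground truth, the fitted coefficients of irrelevant features are driven exactly to zero, which is the feature-selection behavior the sentence refers to.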
3 code implementations • 27 Jun 2022 • Thomas Moreau, Mathurin Massias, Alexandre Gramfort, Pierre Ablin, Pierre-Antoine Bannier, Benjamin Charlier, Mathieu Dagréou, Tom Dupré La Tour, Ghislain Durif, Cassio F. Dantas, Quentin Klopfenstein, Johan Larsson, En Lai, Tanguy Lefort, Benoit Malézieux, Badr Moufad, Binh T. Nguyen, Alain Rakotomamonjy, Zaccharie Ramzi, Joseph Salmon, Samuel Vaiter
Numerical validation is at the core of machine learning research, as it makes it possible to assess the actual impact of new methods and to confirm the agreement between theory and practice.
2 code implementations • 16 Apr 2022 • Quentin Bertrand, Quentin Klopfenstein, Pierre-Antoine Bannier, Gauthier Gidel, Mathurin Massias
We propose a new fast algorithm to estimate any sparse generalized linear model with convex or non-convex separable penalties.
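The problem class described here (a smooth data-fit term plus a separable, possibly non-convex penalty) can be illustrated with a much simpler solver than the paper's: a generic proximal-gradient loop where the penalty enters only through its elementwise proximal operator. This is a hypothetical sketch, not the proposed algorithm; names and parameters are assumptions:

```python
import numpy as np

def prox_grad(X, y, prox, lam, n_iter=500):
    """Proximal gradient for (1/2n)||y - X b||^2 + penalty, where
    `prox(v, t)` is the elementwise proximal operator of t * penalty."""
    n, p = X.shape
    step = n / np.linalg.norm(X, 2) ** 2    # 1 / Lipschitz constant
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n
        beta = prox(beta - step * grad, step * lam)
    return beta

# Convex penalty: L1 (soft-thresholding) -> lasso.
l1_prox = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
# Non-convex penalty: L0 (hard-thresholding); prox of t * ||.||_0.
l0_prox = lambda v, t: np.where(np.abs(v) > np.sqrt(2.0 * t), v, 0.0)
```

Swapping `l1_prox` for `l0_prox` changes the penalty without touching the solver loop, which is what "separable penalties" buys.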
1 code implementation • 4 May 2021 • Quentin Bertrand, Quentin Klopfenstein, Mathurin Massias, Mathieu Blondel, Samuel Vaiter, Alexandre Gramfort, Joseph Salmon
Finding the optimal hyperparameters of a model can be cast as a bilevel optimization problem, typically solved using zero-order techniques.
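The bilevel structure mentioned here has an inner problem (fit the model for a fixed hyperparameter) nested inside an outer problem (pick the hyperparameter minimizing validation loss). A zero-order baseline is plain grid search, which uses only function evaluations of the outer objective. A minimal sketch with a closed-form ridge inner solver (hypothetical names and data, not the paper's method):

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Inner problem: closed-form ridge regression solution."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def grid_search(X_tr, y_tr, X_val, y_val, lams):
    """Outer problem solved by zero-order (derivative-free) grid search."""
    val_losses = [np.mean((y_val - X_val @ ridge_fit(X_tr, y_tr, lam)) ** 2)
                  for lam in lams]
    return lams[int(np.argmin(val_losses))]
```

The paper's point is that this outer loop can instead be tackled with first-order information (gradients of the validation loss with respect to the hyperparameter), which scales much better than enumerating a grid.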
no code implementations • 22 Oct 2020 • Quentin Klopfenstein, Quentin Bertrand, Alexandre Gramfort, Joseph Salmon, Samuel Vaiter
For composite nonsmooth optimization problems, the Forward-Backward algorithm achieves model identification (e.g. support identification for the Lasso) after a finite number of iterations, provided the objective function is regular enough.
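The finite-iteration identification claim can be observed empirically: running Forward-Backward (ISTA) on a lasso problem and recording the set of nonzero coordinates at each iterate, the support stops changing after finitely many steps even though the iterates themselves keep converging. A numpy-only illustration (hypothetical data and names, not the paper's code):

```python
import numpy as np

def ista_lasso(X, y, lam, n_iter=400):
    """Forward-Backward (ISTA) on the lasso, recording the support
    (indices of nonzero coordinates) at every iteration."""
    n, p = X.shape
    step = n / np.linalg.norm(X, 2) ** 2    # 1 / Lipschitz constant
    beta = np.zeros(p)
    supports = []
    for _ in range(n_iter):
        # forward step: gradient of the smooth part (1/2n)||y - X b||^2
        beta = beta - step * X.T @ (X @ beta - y) / n
        # backward step: prox of lam * ||.||_1 (soft-thresholding)
        beta = np.sign(beta) * np.maximum(np.abs(beta) - step * lam, 0.0)
        supports.append(tuple(np.flatnonzero(beta)))
    return beta, supports
```

On well-conditioned data the recorded supports become constant long before the iterates converge, which is the model identification phenomenon the sentence describes.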
1 code implementation • ICML 2020 • Quentin Bertrand, Quentin Klopfenstein, Mathieu Blondel, Samuel Vaiter, Alexandre Gramfort, Joseph Salmon
Our approach scales to high-dimensional data by leveraging the sparsity of the solutions.
1 code implementation • 6 Nov 2019 • Quentin Klopfenstein, Samuel Vaiter
This paper studies the addition of linear constraints to the Support Vector Regression (SVR) when the kernel is linear.
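With a linear kernel, the SVR objective acts directly on the coefficient vector, so linear constraints (for instance nonnegativity of the coefficients) can be imposed on it. As a rough illustration of the constrained problem, not the paper's algorithm, a projected-subgradient sketch with an assumed nonnegativity constraint:

```python
import numpy as np

def constrained_linear_svr(X, y, C=1.0, eps=0.1, n_iter=1000, lr=0.05):
    """Projected subgradient for a linear SVR with b >= 0 (one example
    of a linear constraint):
        min_b  0.5 * ||b||^2 + C * sum_i max(|y_i - x_i @ b| - eps, 0)
        s.t.   b >= 0
    """
    beta = np.zeros(X.shape[1])
    for t in range(1, n_iter + 1):
        r = y - X @ beta
        outside = np.abs(r) > eps          # points outside the eps-tube
        sub = beta - C * X[outside].T @ np.sign(r[outside])
        beta = beta - (lr / np.sqrt(t)) * sub
        beta = np.maximum(beta, 0.0)       # project onto the constraint set
    return beta
```

In practice the constrained problem remains a quadratic program (the dual analyzed in the paper); the subgradient loop above is only meant to show where the linear constraint enters.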