1 code implementation • 13 Jun 2023 • Tanguy Marchand, Régis Loeb, Ulysse Marteau-Ferey, Jean Ogier du Terrail, Arthur Pignet
We consider a cross-silo federated learning (FL) setting where a machine learning model with a fully connected first layer is trained between different clients and a central server using FedAvg, and where the aggregation step can be performed with secure aggregation (SA).
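A minimal sketch of the aggregation step in such a setting, assuming a simple pairwise-mask form of secure aggregation (helper names like `masked_update` are illustrative, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
n_clients, dim = 3, 5

# Each client's local model update (e.g., gradients of the fully connected first layer).
updates = [rng.normal(size=dim) for _ in range(n_clients)]

# Pairwise random masks: clients i < j agree on a shared mask masks[(i, j)].
masks = {(i, j): rng.normal(size=dim)
         for i in range(n_clients) for j in range(i + 1, n_clients)}

def masked_update(i):
    """Client i adds masks shared with higher-indexed clients and subtracts
    those shared with lower-indexed ones, so all masks cancel in the sum."""
    out = updates[i].copy()
    for j in range(n_clients):
        if i < j:
            out += masks[(i, j)]
        elif j < i:
            out -= masks[(j, i)]
    return out

# The server sees only masked updates, yet their average is the FedAvg aggregate.
aggregate = sum(masked_update(i) for i in range(n_clients)) / n_clients
assert np.allclose(aggregate, np.mean(updates, axis=0))
```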
no code implementations • 20 Oct 2021 • Ulysse Marteau-Ferey, Francis Bach, Alessandro Rudi
In many areas of applied statistics and machine learning, generating an arbitrary number of independent and identically distributed (i.i.d.) samples from a given distribution is a key task.
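As a toy illustration of that task (not the approach studied in the paper), inverse-transform sampling generates i.i.d. draws from any distribution whose inverse CDF is tractable:

```python
import numpy as np

def sample_exponential(rate, n, rng):
    """Inverse-transform sampling: if U ~ Uniform(0, 1), then
    -log(1 - U) / rate is distributed as Exponential(rate)."""
    u = rng.uniform(size=n)
    return -np.log1p(-u) / rate

rng = np.random.default_rng(0)
draws = sample_exponential(rate=2.0, n=100_000, rng=rng)
print(draws.mean())  # close to 1 / rate = 0.5
```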
no code implementations • 22 Dec 2020 • Alessandro Rudi, Ulysse Marteau-Ferey, Francis Bach
We consider the global minimization of smooth functions based solely on function evaluations.
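A minimal illustration of this evaluation-only setting, using a naive random-search baseline rather than the kernel-based method developed in the paper (the test function is an arbitrary example):

```python
import numpy as np

def f(x):
    # Smooth test function with global minimum 0 at x = (0.5, 0.5).
    return np.sum((x - 0.5) ** 2)

rng = np.random.default_rng(0)
# Only function evaluations are available: sample candidate points, keep the best.
candidates = rng.uniform(0.0, 1.0, size=(10_000, 2))
values = np.array([f(x) for x in candidates])
best = candidates[values.argmin()]
print(best, values.min())
```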
1 code implementation • NeurIPS 2020 • Ulysse Marteau-Ferey, Francis Bach, Alessandro Rudi
The paper is complemented by an experimental evaluation of the model, showing the effectiveness of its formulation and algorithmic derivation through practical results on density estimation, regression with heteroscedastic errors, and multiple quantile regression.
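The model in question represents a non-negative function as a quadratic form in a feature map, f(x) = Φ(x)ᵀAΦ(x) with A positive semidefinite. A minimal sketch of this family, assuming Gaussian bump features (the feature choice and anchor points are illustrative, not the paper's algorithm):

```python
import numpy as np

def gaussian_features(x, centers, sigma=0.5):
    """Feature map phi(x): one Gaussian bump per anchor point."""
    return np.exp(-np.sum((x - centers) ** 2, axis=-1) / (2 * sigma ** 2))

rng = np.random.default_rng(0)
centers = rng.uniform(-1, 1, size=(10, 1))

# A = B B^T is positive semidefinite by construction, so
# f(x) = phi(x)^T A phi(x) = ||B^T phi(x)||^2 >= 0 for every x.
B = rng.normal(size=(10, 10))
A = B @ B.T

def f(x):
    phi = gaussian_features(np.atleast_2d(x), centers)
    return float(phi @ A @ phi)

print(f(np.array([0.3])))  # always non-negative
```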
2 code implementations • NeurIPS 2019 • Ulysse Marteau-Ferey, Francis Bach, Alessandro Rudi
In this paper, we study large-scale convex optimization algorithms based on the Newton method applied to regularized generalized self-concordant losses, which include logistic regression and softmax regression.
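A minimal sketch of plain Newton iterations for ℓ2-regularized logistic regression, one of the losses covered by the self-concordance framework (exact Hessians and the name `newton_logreg` are illustrative; the paper's large-scale scheme uses approximate second-order steps):

```python
import numpy as np

def newton_logreg(X, y, lam, iters=20):
    """Newton's method on L(w) = mean(log(1 + exp(-y * Xw))) + (lam/2)||w||^2,
    with labels y in {-1, +1}."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        margins = y * (X @ w)
        p = 1.0 / (1.0 + np.exp(margins))   # sigmoid(-margins)
        grad = -(X.T @ (y * p)) / n + lam * w
        s = p * (1.0 - p)                   # per-sample Hessian weights
        H = (X.T * s) @ X / n + lam * np.eye(d)
        w -= np.linalg.solve(H, grad)
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = np.sign(X @ w_true + 0.1 * rng.normal(size=200))
print(newton_logreg(X, y, lam=1e-2))
```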
no code implementations • 8 Feb 2019 • Ulysse Marteau-Ferey, Dmitrii Ostrovskii, Francis Bach, Alessandro Rudi
We consider learning methods based on the regularization of a convex empirical risk by a squared Hilbertian norm, a setting that includes linear predictors and non-linear predictors through positive-definite kernels.
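For the squared-loss instance of this setting, regularizing the empirical risk by a squared Hilbertian norm with a positive-definite kernel reduces, via the representer theorem, to a linear system. A minimal kernel ridge regression sketch, assuming a Gaussian kernel:

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def kernel_ridge_fit(X, y, lam):
    """Minimize (1/n) sum (f(x_i) - y_i)^2 + lam * ||f||_H^2 over the RKHS;
    the representer theorem gives f = sum_i alpha_i k(., x_i) with
    alpha solving (K + n * lam * I) alpha = y."""
    n = len(y)
    K = gaussian_kernel(X, X)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=100)
alpha = kernel_ridge_fit(X, y, lam=1e-3)
X_test = np.array([[0.0], [0.5]])
print(gaussian_kernel(X_test, X) @ alpha)  # predictions at test points
```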