no code implementations • 25 Jul 2023 • Lénaïc Chizat, Tomas Vaškevičius
We study the computation of doubly regularized Wasserstein barycenters, a recently introduced family of entropic barycenters governed by inner and outer regularization strengths.
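Schematically, the doubly regularized objective has the following shape, where OT_λ denotes entropy-regularized optimal transport with inner strength λ and the outer term penalizes the entropy of the barycenter candidate ν; the weights w_k and the entropy normalization below are illustrative rather than the paper's exact definitions:

$$
\min_{\nu}\; \sum_{k=1}^{K} w_k\, \mathrm{OT}_{\lambda}(\mu_k, \nu) \;+\; \tau\, \mathrm{H}(\nu),
$$

with λ controlling the smoothing inside each transport problem and τ controlling how strongly the barycenter itself is regularized.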
no code implementations • 29 Jun 2023 • Jaouad Mourtada, Tomas Vaškevičius, Nikita Zhivotovskiy
In this paper, we revisit and tighten classical results in the theory of aggregation in the statistical setting by replacing the global complexity with a smaller, local one.
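For context, here is a schematic version of the kind of bound being localized (notation ours): in model selection aggregation over a finite class f_1, ..., f_M with n observations, classical results give exact oracle inequalities of the form

$$
\mathbb{E}\, R(\widehat{f}\,) \;\le\; \min_{1 \le k \le M} R(f_k) \;+\; \frac{C \log M}{n},
$$

where log M is a global complexity term; a local analysis replaces it with a smaller quantity that depends only on the models whose risk is close to that of the best one.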
no code implementations • 25 Feb 2021 • Jaouad Mourtada, Tomas Vaškevičius, Nikita Zhivotovskiy
In this distribution-free regression setting, we show that boundedness of the conditional second moment of the response given the covariates is a necessary and sufficient condition for achieving nontrivial guarantees.
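In symbols (with illustrative notation m for the bound), the condition reads

$$
\sup_{x}\; \mathbb{E}\!\left[\, Y^2 \mid X = x \,\right] \;\le\; m^2 \;<\; \infty,
$$

a constraint on the response given the covariates only, with no assumptions placed on the distribution of the covariates themselves.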
no code implementations • 19 Sep 2020 • Tomas Vaškevičius, Nikita Zhivotovskiy
We study the problem of predicting as well as the best linear predictor in a bounded Euclidean ball with respect to the squared loss.
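A standard baseline for this setting is the Vovk–Azoury–Warmuth forecaster: ridge regression whose Gram matrix also includes the current, not-yet-labeled covariate. The NumPy sketch below is illustrative only; the regularization level reg and the synthetic data are our choices, and this is not presented as the paper's estimator:

```python
import numpy as np

def vaw_forecaster(X, y, reg=1.0):
    """Vovk-Azoury-Warmuth forecaster for online linear regression with
    squared loss. At round t it predicts
        yhat_t = x_t^T (reg*I + sum_{s<=t} x_s x_s^T)^{-1} sum_{s<t} y_s x_s,
    i.e. ridge regression whose Gram matrix already contains the current
    covariate, while the label y_t is revealed only after predicting."""
    n, d = X.shape
    A = reg * np.eye(d)   # regularized Gram matrix
    b = np.zeros(d)       # running sum of y_s * x_s over past rounds
    preds = np.empty(n)
    for t in range(n):
        x = X[t]
        A += np.outer(x, x)               # include current covariate before predicting
        preds[t] = x @ np.linalg.solve(A, b)
        b += y[t] * x                     # label observed after the prediction
    return preds

# Illustrative run on synthetic data (all values here are our choices).
rng = np.random.default_rng(0)
n, d = 500, 5
X = rng.normal(size=(n, d)) / np.sqrt(d)  # covariates roughly in the unit ball
w_star = rng.normal(size=d)
y = X @ w_star + 0.1 * rng.normal(size=n)
preds = vaw_forecaster(X, y)
print("average squared loss:", np.mean((preds - y) ** 2))
```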
no code implementations • NeurIPS 2020 • Tomas Vaškevičius, Varun Kanade, Patrick Rebeschini
Recently, there has been a surge of interest in understanding the implicit regularization properties of iterative gradient-based optimization algorithms.
1 code implementation • NeurIPS 2019 • Tomas Vaškevičius, Varun Kanade, Patrick Rebeschini
We investigate implicit regularization schemes for gradient descent methods applied to unpenalized least squares regression, with the goal of reconstructing a sparse signal from an underdetermined system of linear measurements under the restricted isometry assumption.
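A minimal sketch of the kind of scheme studied in this line of work: plain gradient descent on the unpenalized least squares objective under the Hadamard reparametrization w = u ⊙ u − v ⊙ v with small initialization α, which biases the iterates toward sparse solutions without any explicit penalty. The step size, iteration count, and toy data below are our choices; the paper's actual guarantees additionally rely on carefully tuned initialization and stopping times:

```python
import numpy as np

def hadamard_gd(X, y, alpha=1e-6, lr=0.1, n_iters=5000):
    """Gradient descent on the unpenalized least squares loss
    0.5/m * ||X w - y||^2 under the reparametrization w = u*u - v*v,
    starting from the small initialization u = v = alpha. The small
    initialization implicitly biases the trajectory toward sparse w."""
    m, d = X.shape
    u = alpha * np.ones(d)
    v = alpha * np.ones(d)
    for _ in range(n_iters):
        w = u * u - v * v
        grad = X.T @ (X @ w - y) / m  # gradient of the loss in w
        u -= lr * 2 * u * grad        # chain rule: dw/du = 2u
        v += lr * 2 * v * grad        # chain rule: dw/dv = -2v
    return u * u - v * v

# Toy demo: sparse signal, underdetermined Gaussian design (our choices).
rng = np.random.default_rng(0)
m, d, k = 50, 200, 5
X = rng.normal(size=(m, d)) / np.sqrt(m)  # roughly unit-norm columns
w_star = np.zeros(d)
w_star[rng.choice(d, size=k, replace=False)] = rng.normal(size=k)
y = X @ w_star
w_hat = hadamard_gd(X, y)
print("recovery error:", np.linalg.norm(w_hat - w_star))
```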