1 code implementation • ICML 2020 • Mathias Staudigl, Pavel Dvurechenskii, Shimrit Shtern, Kamil Safin, Petr Ostroukhov
Projection-free optimization via different variants of the Frank-Wolfe (FW) method has become one of the cornerstones in optimization for machine learning, since in many cases the linear minimization oracle is much cheaper to implement than projections and some sparsity needs to be preserved.
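To illustrate the point about the linear minimization oracle (LMO), here is a minimal, generic Frank-Wolfe sketch (not the paper's specific method): minimizing a simple quadratic over the probability simplex, where the LMO reduces to an `argmin` over coordinates, far cheaper than a Euclidean projection, and each step mixes in at most one simplex vertex, which preserves sparsity.

```python
import numpy as np

def frank_wolfe(b, iters=500):
    """Minimize f(x) = 0.5 * ||x - b||^2 over the probability simplex
    with the classic Frank-Wolfe iteration (illustrative sketch)."""
    n = b.size
    x = np.ones(n) / n                   # feasible starting point
    for k in range(iters):
        grad = x - b                     # gradient of f at x
        i = int(np.argmin(grad))         # LMO over the simplex: pick the
        s = np.zeros(n)                  # best vertex e_i -- O(n), no
        s[i] = 1.0                       # projection needed
        gamma = 2.0 / (k + 2)            # standard step-size schedule
        x = (1 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

b = np.array([0.7, 0.2, 0.1])            # b lies in the simplex, so x* = b
x = frank_wolfe(b)
```

The classical guarantee gives an objective gap of order $O(1/k)$, so after 500 iterations the iterate is close to the optimum while remaining feasible at every step.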
no code implementations • 7 Jul 2022 • Pavel Dvurechensky, Shimrit Shtern, Mathias Staudigl
We propose a new homotopy-based conditional gradient method for solving convex optimization problems with a large number of simple conic constraints.
no code implementations • 25 Mar 2021 • Olivier Bilenne, Paulin Jacquot, Nadia Oudjane, Mathias Staudigl, Cheng Wan, Barbara Franci
An important issue in today's electricity markets is the management of flexibilities offered by new practices, such as smart home appliances or electric vehicles.
no code implementations • 4 Jan 2021 • Pavel Dvurechensky, Mathias Staudigl, Shimrit Shtern
In this survey we cover a number of key developments in gradient-based optimization methods.
no code implementations • 8 Oct 2020 • Duong Viet Thong, Aviv Gibali, Mathias Staudigl, Phan Tu Vuong
Dynamic user equilibrium (DUE) is a Nash-like solution concept describing an equilibrium in dynamic traffic systems over a fixed planning period.
1 code implementation • 11 Feb 2020 • Pavel Dvurechensky, Petr Ostroukhov, Kamil Safin, Shimrit Shtern, Mathias Staudigl
Projection-free optimization via different variants of the Frank-Wolfe (FW), a.k.a. conditional gradient, method has become one of the cornerstones in optimization for machine learning, since in many cases the linear minimization oracle is much cheaper to implement than projections and some sparsity needs to be preserved.
no code implementations • 4 Nov 2019 • Pavel Dvurechensky, Mathias Staudigl, César A. Uribe
Many problems in statistical learning, imaging, and computer vision involve the optimization of a non-convex objective function with singularities at the boundary of the feasible set.
no code implementations • 9 Feb 2019 • Radu Ioan Bot, Panayotis Mertikopoulos, Mathias Staudigl, Phan Tu Vuong
We develop a new stochastic algorithm with variance reduction for solving pseudo-monotone stochastic variational inequalities.
no code implementations • 25 Sep 2018 • Immanuel M. Bomze, Panayotis Mertikopoulos, Werner Schachinger, Mathias Staudigl
In the case of linearly constrained quadratic programs (not necessarily convex), we also show that the method's convergence rate is $\mathcal{O}(1/k^\rho)$ for some $\rho\in(0, 1]$ that depends only on the choice of kernel function (i.e., not on the problem's primitives).
no code implementations • 10 Sep 2018 • Benoit Duvocelle, Panayotis Mertikopoulos, Mathias Staudigl, Dries Vermeulen
We examine the long-run behavior of multi-agent online learning in games that evolve over time.
no code implementations • 21 Nov 2016 • Panayotis Mertikopoulos, Mathias Staudigl
In the vanishing noise limit, we show that the dynamics converge to the solution set of the underlying problem almost surely (a.s.).