no code implementations • 15 Apr 2024 • Ou Hongjia, Andreas Themelis
Leveraging recent advances in adaptive methods for convex minimization, this paper provides a linesearch-free proximal gradient framework for globalizing the convergence of popular stepsize choices such as Barzilai-Borwein and one-dimensional Anderson acceleration.
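As an illustration of the kind of stepsize the paper globalizes, here is a minimal sketch of a proximal gradient iteration driven by the (BB1) Barzilai-Borwein spectral stepsize, applied to a quadratic plus an l1 term; the objective and all names are illustrative, not the paper's framework:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_bb(grad_f, prox_g, x0, gamma0=1.0, iters=100):
    # Proximal gradient iteration with the (BB1) Barzilai-Borwein
    # spectral stepsize. Without a linesearch or adaptive safeguards
    # such as those the paper develops, this heuristic is not
    # guaranteed to converge in general.
    x = np.asarray(x0, dtype=float).copy()
    g = grad_f(x)
    gamma = gamma0
    for _ in range(iters):
        x_new = prox_g(x - gamma * g, gamma)
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:
            gamma = (s @ s) / sy  # BB1 stepsize <s, s> / <s, y>
        x, g = x_new, g_new
    return x
```

On well-conditioned convex instances the BB stepsize often converges quickly without any safeguard, which is why globalizing it without a linesearch is attractive.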
no code implementations • 9 Feb 2024 • Konstantinos A. Oikonomidis, Emanuel Laude, Puya Latafat, Andreas Themelis, Panagiotis Patrinos
We show that adaptive proximal gradient methods for convex problems are not restricted to traditional Lipschitzian assumptions.
1 code implementation • 30 Nov 2023 • Puya Latafat, Andreas Themelis, Panagiotis Patrinos
Building on recent work on linesearch-free adaptive proximal gradient methods, this paper proposes AdaPG$^{\pi, r}$, a framework that unifies and extends existing results by providing larger stepsize policies and improved lower bounds.
2 code implementations • 11 Jan 2023 • Puya Latafat, Andreas Themelis, Lorenzo Stella, Panagiotis Patrinos
Backtracking linesearch is the de facto approach for minimizing continuously differentiable functions with locally Lipschitz gradient.
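For context, a minimal sketch of the classical backtracking (Armijo) gradient step that this line of work replaces; the function names and constants are illustrative:

```python
import numpy as np

def armijo_gradient_step(f, grad_f, x, gamma0=1.0, beta=0.5, c=1e-4):
    # One classical backtracking (Armijo) gradient step: shrink the
    # stepsize until a sufficient-decrease condition holds. Each
    # trial costs an extra evaluation of f, which is precisely the
    # overhead that linesearch-free adaptive stepsizes avoid.
    g = grad_f(x)
    fx = f(x)
    gamma = gamma0
    while f(x - gamma * g) > fx - c * gamma * (g @ g):
        gamma *= beta
    return x - gamma * g, gamma
```

Each rejected trial stepsize wastes a full evaluation of f, which is why adaptive stepsize rules that need no trial evaluations are appealing for expensive objectives.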
no code implementations • 3 Jan 2023 • Stephen Hardy, Andreas Themelis, Kaoru Yamamoto, Hakan Ergun, Dirk Van Hertem
To this end, a multi-period, stochastic GATE planning formulation is developed for both nodal and zonal market designs.
no code implementations • 17 Jul 2022 • Pourya Behmandpoor, Puya Latafat, Andreas Themelis, Marc Moonen, Panagiotis Patrinos
We introduce SPIRAL, a SuPerlinearly convergent Incremental pRoximal ALgorithm, for solving nonconvex regularized finite sum problems under a relative smoothness assumption.
no code implementations • 15 Mar 2021 • Miguel Simões, Andreas Themelis, Panagiotis Patrinos
Lasry-Lions envelopes can also be seen as an "intermediate" between a given function and its convex envelope; we exploit this property to develop a method that builds a sequence of approximate subproblems that are easier to solve than the original problem.
1 code implementation • 6 Oct 2020 • Ben Hermans, Andreas Themelis, Panagiotis Patrinos
The resulting implementation proves extremely robust in numerical simulations, solving all of the Maros-Mészáros problems and finding a stationary point for most of the nonconvex QPs in the CUTEst test set.
Optimization and Control 90C05, 90C20, 90C26, 49J53, 49M15
1 code implementation • 20 May 2020 • Andreas Themelis, Lorenzo Stella, Panagiotis Patrinos
Although popular optimization algorithms such as Douglas-Rachford splitting (DRS) and ADMM perform satisfactorily on small, well-scaled problems, ill conditioning and problem size pose severe obstacles to their reliable use.
Optimization and Control 90C06, 90C25, 90C26, 49J52, 49J53
3 code implementations • 24 Jun 2019 • Puya Latafat, Andreas Themelis, Panagiotis Patrinos
This paper analyzes block-coordinate proximal gradient methods for minimizing the sum of a separable smooth function and a (nonseparable) nonsmooth function, both of which are allowed to be nonconvex.
Optimization and Control 90C06, 90C25, 90C26, 49J52, 49J53
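A minimal sketch of a randomized block-coordinate proximal gradient iteration in the spirit of the setting above; for simplicity the nonsmooth term here is taken block-separable (a special case, since the paper allows a nonseparable one), and all names are illustrative:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def bc_prox_grad(grad_blocks, prox_blocks, x0, gamma, iters=400, seed=0):
    # Randomized block-coordinate proximal gradient: at each
    # iteration one randomly sampled block takes a forward-backward
    # update while the other blocks stay fixed. With a
    # block-separable nonsmooth term the prox applies blockwise;
    # the paper's analysis covers more general (nonseparable) terms.
    rng = np.random.default_rng(seed)
    x = [np.asarray(xi, dtype=float).copy() for xi in x0]
    for _ in range(iters):
        i = rng.integers(len(x))
        x[i] = prox_blocks[i](x[i] - gamma * grad_blocks[i](x[i]), gamma)
    return x
```

Updating one block per iteration keeps the per-iteration cost small, which is the main appeal of block-coordinate schemes on large problems.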
1 code implementation • 28 May 2019 • Masoud Ahookhosh, Andreas Themelis, Panagiotis Patrinos
We introduce Bella, a locally superlinearly convergent Bregman forward-backward splitting method for minimizing the sum of two nonconvex functions, one of which satisfies a relative smoothness condition while the other may be nonsmooth.
Optimization and Control 90C06, 90C25, 90C26, 49J52, 49J53
1 code implementation • 22 Sep 2016 • Andreas Themelis, Panagiotis Patrinos
As a result, SuperMann enhances and robustifies all operator splitting schemes for structured convex optimization, overcoming their well-known sensitivity to ill conditioning.
Optimization and Control 47H09, 90C25, 90C53, 65K15
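For reference, a minimal sketch of the classical Krasnosel'skii-Mann fixed-point iteration that SuperMann builds on (not SuperMann itself), with an illustrative affine operator standing in for a splitting scheme:

```python
import numpy as np

def km_iteration(T, x0, lam=0.5, iters=200):
    # Classical Krasnosel'skii-Mann iteration for finding a fixed
    # point of a nonexpansive operator T via averaged updates
    #   x+ = (1 - lam) * x + lam * T(x).
    # SuperMann generalizes this scheme, injecting (quasi-Newton)
    # directions to accelerate convergence to a fixed point of T.
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        x = (1.0 - lam) * x + lam * T(x)
    return x
```

Operator splitting methods such as DRS can be written as fixed-point iterations of a nonexpansive map, which is why a faster fixed-point solver benefits all of them at once.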
5 code implementations • 20 Jun 2016 • Andreas Themelis, Lorenzo Stella, Panagiotis Patrinos
Extending previous results, we show that, despite being nonsmooth for fully nonconvex problems, the FBE still enjoys favorable first- and second-order properties that are key to the convergence results of ZeroFPR.
Optimization and Control 90C06, 90C25, 90C26, 90C53, 49J52, 49J53
2 code implementations • 27 Apr 2016 • Lorenzo Stella, Andreas Themelis, Panagiotis Patrinos
We propose an algorithmic scheme that enjoys the same global convergence properties as FBS when the problem is convex, or when the objective function possesses the Kurdyka-Łojasiewicz property at its critical points.
Optimization and Control