no code implementations • 9 Feb 2024 • Konstantinos A. Oikonomidis, Emanuel Laude, Puya Latafat, Andreas Themelis, Panagiotis Patrinos
We show that adaptive proximal gradient methods for convex problems are not restricted to traditional Lipschitzian assumptions.
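The flavor of such stepsize adaptation can be sketched as follows; this is a generic illustration, not the paper's rule — the lasso-type objective, the Barzilai–Borwein-style local curvature estimate, and the initial stepsize are all assumptions:

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def adaptive_prox_grad(A, b, lam, x0, iters=200):
    """Proximal gradient for 0.5*||Ax - b||^2 + lam*||x||_1 with a
    stepsize estimated from successive gradients instead of a global
    Lipschitz constant (illustrative sketch)."""
    x = x0.copy()
    grad = A.T @ (A @ x - b)
    step = 1e-3  # conservative initial stepsize (assumption)
    for _ in range(iters):
        x_new = soft_threshold(x - step * grad, step * lam)
        grad_new = A.T @ (A @ x_new - b)
        dx, dg = x_new - x, grad_new - grad
        ng = np.linalg.norm(dg)
        if ng > 1e-12:
            # Inverse of a local Lipschitz estimate ||dg|| / ||dx||
            step = np.linalg.norm(dx) / ng
        x, grad = x_new, grad_new
    return x
```

For `A = I` the local estimate settles at the true curvature and the iteration reaches the soft-thresholding solution without any Lipschitz constant being supplied in advance.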
no code implementations • 13 Jul 2021 • Hartmut Bauermeister, Emanuel Laude, Thomas Möllenhoff, Michael Moeller, Daniel Cremers
In contrast to existing discretizations which suffer from a grid bias, we show that a piecewise polynomial discretization better preserves the continuous nature of our problem.
no code implementations • 8 Oct 2019 • Mahesh Chandra Mukkamala, Felix Westerkamp, Emanuel Laude, Daniel Cremers, Peter Ochs
This motivated the development of the Bregman proximal gradient (BPG) algorithm and its inertial (momentum-based) variant, CoCaIn BPG, which, however, rely on problem-dependent Bregman distances.
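A minimal sketch of what a problem-dependent Bregman distance buys: on the probability simplex, taking the Boltzmann–Shannon entropy as kernel turns the Bregman proximal step into a closed-form multiplicative update. The kernel choice, objective, and stepsize below are illustrative assumptions, and the inertial CoCaIn variant is not shown:

```python
import numpy as np

def bpg_simplex(grad_f, x0, step=0.5, iters=300):
    """One-kernel Bregman proximal gradient sketch on the simplex.
    With h(x) = sum_i x_i log x_i, the step
        x+ = argmin <grad, u> + (1/step) * D_h(u, x)  s.t.  u in simplex
    has the closed form x+ ∝ x * exp(-step * grad)."""
    x = x0.copy()
    for _ in range(iters):
        g = grad_f(x)
        x = x * np.exp(-step * (g - g.max()))  # shift for numerical stability
        x /= x.sum()
    return x
```

For a linear objective `f(x) = <c, x>` the iterates concentrate on the coordinate with the smallest cost, as expected for minimization over the simplex.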
no code implementations • 27 Mar 2019 • Emanuel Laude, Tao Wu, Daniel Cremers
In this work, we consider nonconvex composite problems that involve inf-convolution with a Legendre function, which gives rise to an anisotropic generalization of the proximal mapping and the Moreau envelope.
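The construction can be illustrated numerically: replacing the squared distance in the classical Moreau envelope by a general Legendre function `phi` gives `env(x) = inf_u f(u) + phi(x - u)`, with the minimizer playing the role of an anisotropic proximal point. A brute-force 1D evaluation (the grid search is purely illustrative, not the paper's method):

```python
import numpy as np

def anisotropic_envelope(f, phi, x, grid):
    """Evaluate env(x) = min_u f(u) + phi(x - u) and its minimizer
    by brute force on a 1D grid (illustrative sketch)."""
    vals = f(grid) + phi(x - grid)
    i = np.argmin(vals)
    return vals[i], grid[i]
```

Choosing the isotropic `phi(z) = z**2 / 2` recovers the classical Moreau envelope; e.g. for `f = |.|` the minimizer is the soft-thresholding of `x`.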
no code implementations • CVPR 2018 • Emanuel Laude, Jan-Hendrik Lange, Jonas Schüpfer, Csaba Domokos, Laura Leal-Taixé, Frank R. Schmidt, Bjoern Andres, Daniel Cremers
This paper introduces a novel algorithm for transductive inference in higher-order MRFs, where the unary energies are parameterized by a variable classifier.
1 code implementation • 7 Apr 2016 • Emanuel Laude, Thomas Möllenhoff, Michael Moeller, Jan Lellmann, Daniel Cremers
Convex relaxations of nonconvex multilabel problems have been demonstrated to produce superior (provably optimal or near-optimal) solutions to a variety of classical computer vision problems.
2 code implementations • CVPR 2016 • Thomas Möllenhoff, Emanuel Laude, Michael Moeller, Jan Lellmann, Daniel Cremers
We propose a novel spatially continuous framework for convex relaxations based on functional lifting.
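The core idea of lifting can be sketched in a discrete toy form: a scalar function `u` is represented by the binary indicators of its super-level sets, `phi(x, t) = 1 if u(x) > t`, and recovered via the layer-cake formula. The discretization below is an illustrative assumption — the paper's framework is spatially continuous:

```python
import numpy as np

def lift(u, levels):
    """Lift a scalar signal u into super-level-set indicators:
    one row per sample of u, one column per level t."""
    return (u[:, None] > levels[None, :]).astype(float)

def unlift(phi, levels):
    # Layer-cake recovery: integrate the indicators over the level axis
    dt = levels[1] - levels[0]
    return levels[0] + phi.sum(axis=1) * dt
```

The lifted variable lives in a higher-dimensional space where nonconvex label costs can be relaxed convexly; `unlift` shows that no information is lost up to the level discretization.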