no code implementations • 29 Jan 2021 • Vladimir Kolmogorov, Thomas Pock
In the case where $h^*$ is the indicator function of a linear constraint and the function $f$ is quadratic, we show a $O(1/n^2)$ convergence rate on the dual objective, requiring $O(n \log n)$ calls of the linear minimization oracle ($\mathrm{lmo}$).
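The rate above is stated in terms of calls to the linear minimization oracle. As a minimal sketch of how such an oracle is used (a generic Frank-Wolfe iteration on a quadratic over the probability simplex — an illustration of the lmo interface, not the algorithm from the paper; all names are hypothetical):

```python
import numpy as np

def lmo_simplex(grad):
    """Linear minimization oracle (lmo) over the probability simplex:
    argmin_{s in simplex} <grad, s> is the vertex at the coordinate
    with the smallest gradient entry."""
    s = np.zeros_like(grad)
    s[np.argmin(grad)] = 1.0
    return s

def frank_wolfe(A, b, n_iters=200):
    """Minimize the quadratic f(x) = 0.5 x^T A x - b^T x over the
    simplex with the standard step size 2/(k+2), calling the lmo
    once per iteration."""
    x = np.full(len(b), 1.0 / len(b))    # start at the barycenter
    for k in range(n_iters):
        s = lmo_simplex(A @ x - b)       # A x - b is the gradient of f
        gamma = 2.0 / (k + 2.0)
        x = (1.0 - gamma) * x + gamma * s
    return x

# With A = I the unconstrained optimum is b; since b lies in the
# simplex, the iterates converge toward it.
x = frank_wolfe(np.eye(3), np.array([0.2, 0.5, 0.3]))
```

The iterate stays in the simplex by construction, since each step is a convex combination of the current point and a simplex vertex returned by the oracle.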
no code implementations • CVPR 2023 • Vladimir Kolmogorov
We consider the problem of solving LP relaxations of MAP-MRF inference problems, and in particular the method proposed recently in (Swoboda, Kolmogorov 2019; Kolmogorov, Pock 2021).
1 code implementation • CVPR 2019 • Paul Swoboda, Vladimir Kolmogorov
We present a new proximal bundle method for Maximum-A-Posteriori (MAP) inference in structured energy minimization problems.
no code implementations • CVPR 2018 • Pritish Mohapatra, Michal Rolinek, C. V. Jawahar, Vladimir Kolmogorov, M. Pawan Kumar
We provide a complete characterization of the loss functions that are amenable to our algorithm, and show that it includes both AP and NDCG based loss functions.
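For reference, the two ranking losses named above are computed from a ranked list as follows (standard definitions, not the paper's optimization procedure):

```python
import numpy as np

def average_precision(relevance):
    """AP for a ranked list of binary relevance labels
    (mean of precision@k over the positions of relevant items)."""
    rel = np.asarray(relevance, dtype=float)
    hits = np.cumsum(rel)
    precision_at_k = hits / (np.arange(len(rel)) + 1)
    return (precision_at_k * rel).sum() / rel.sum()

def ndcg(relevance):
    """NDCG for a ranked list of graded relevance labels:
    DCG with log2 position discounts, normalized by the ideal DCG."""
    rel = np.asarray(relevance, dtype=float)
    discounts = 1.0 / np.log2(np.arange(len(rel)) + 2)
    dcg = (rel * discounts).sum()
    ideal = (np.sort(rel)[::-1] * discounts).sum()
    return dcg / ideal

# Ranking with relevant items at positions 1 and 3:
ap = average_precision([1, 0, 1, 0])   # (1/1 + 2/3) / 2
```

The corresponding losses are simply 1 - AP and 1 - NDCG, which is the form in which such ranking metrics typically enter structured loss minimization.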
no code implementations • 26 Feb 2015 • Vladimir Kolmogorov, Thomas Pock, Michal Rolinek
We consider the problem of minimizing the continuous-valued total variation subject to different unary terms on trees, and propose fast direct algorithms based on dynamic programming to solve these problems.
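To illustrate the dynamic-programming structure on the simplest case, here is a Viterbi-style DP for total variation on a chain with quadratic unary terms and the labels restricted to a finite grid (the paper's algorithms handle the continuous-valued case directly; this discretized sketch only shows the recursion):

```python
import numpy as np

def tv_chain_dp(y, labels, lam):
    """Minimize sum_i (x_i - y_i)^2 + lam * sum_i |x_{i+1} - x_i|
    over x_i restricted to a finite label set, by Viterbi-style DP."""
    y = np.asarray(y, dtype=float)
    L = np.asarray(labels, dtype=float)
    n, k = len(y), len(L)
    cost = (L - y[0]) ** 2                         # best cost ending in each label
    back = np.zeros((n, k), dtype=int)
    pair = lam * np.abs(L[:, None] - L[None, :])   # pair[j_prev, j]: TV edge cost
    for i in range(1, n):
        total = cost[:, None] + pair               # total[j_prev, j]
        back[i] = np.argmin(total, axis=0)
        cost = total[back[i], np.arange(k)] + (L - y[i]) ** 2
    # backtrack the optimal labeling
    x = np.empty(n, dtype=int)
    x[-1] = int(np.argmin(cost))
    for i in range(n - 1, 0, -1):
        x[i - 1] = back[i, x[i]]
    return L[x]

y = [0.0, 0.1, 1.0, 0.9]
x = tv_chain_dp(y, labels=np.linspace(0.0, 1.0, 11), lam=0.04)
```

Each DP step is $O(k^2)$ in the number of labels; the point of direct continuous algorithms is to avoid this discretization altogether.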
no code implementations • CVPR 2015 • Neel Shah, Vladimir Kolmogorov, Christoph H. Lampert
Structural support vector machines (SSVMs) are amongst the best performing models for structured computer vision tasks, such as semantic image segmentation or human pose estimation.
no code implementations • 22 Apr 2014 • Rustem Takhanov, Vladimir Kolmogorov
We propose a Grammatical Pattern-Based CRF model (GPB) that combines the two in a natural way.
no code implementations • 14 Apr 2014 • Vladimir Kolmogorov, Christoph Lampert, Emilie Morvant, Rustem Takhanov
The 38th Annual Workshop of the Austrian Association for Pattern Recognition (ÖAGM) will be held at IST Austria on May 22-23, 2014.
no code implementations • 7 Oct 2013 • Igor Gridchyn, Vladimir Kolmogorov
Furthermore, the output of our algorithm allows one to speed up the subsequent alpha-expansion for the unlabeled part, or can be used as-is for time-critical applications.
no code implementations • 22 Sep 2013 • Vladimir Kolmogorov
We propose a new family of message passing techniques for MAP estimation in graphical models which we call Sequential Reweighted Message Passing (SRMP).
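For context, here is plain min-sum message passing on a chain, which computes exact min-marginals (SRMP itself is a reweighted scheme for general graphs; the chain special case below only illustrates the message updates):

```python
import numpy as np

def chain_min_marginals(unary, pairwise):
    """Exact min-sum message passing on a chain MRF.

    unary:    (n, k) array of unary costs
    pairwise: (k, k) symmetric array of edge costs (shared by all edges)

    Returns the (n, k) min-marginals m[i, s] = minimum energy over all
    labelings with x_i = s.
    """
    n, k = unary.shape
    fwd = np.zeros((n, k))
    bwd = np.zeros((n, k))
    for i in range(1, n):                 # left-to-right messages
        fwd[i] = np.min(fwd[i-1][:, None] + unary[i-1][:, None] + pairwise, axis=0)
    for i in range(n - 2, -1, -1):        # right-to-left messages
        bwd[i] = np.min(bwd[i+1][None, :] + unary[i+1][None, :] + pairwise, axis=1)
    return fwd + bwd + unary

# 3-node chain, 2 labels, Potts edge cost 0.4:
unary = np.array([[0.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
potts = 0.4 * (1.0 - np.eye(2))
m = chain_min_marginals(unary, potts)
```

The minimum of every row of `m` equals the MAP energy, which gives a convenient consistency check for message-passing implementations.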
no code implementations • 7 Mar 2013 • Carl Olsson, Johannes Ulen, Yuri Boykov, Vladimir Kolmogorov
Energies with high-order non-submodular interactions have been shown to be very useful in vision due to their high modeling power.
no code implementations • 1 Oct 2012 • Rustem Takhanov, Vladimir Kolmogorov
Komodakis &amp; Paragios (2009) gave an $O(nL)$ algorithm for computing the MAP.
no code implementations • NeurIPS 2010 • Vladimir Kolmogorov
We say that the relaxation is {\em totally half-integral} if $\hat f(\mathbf{x})$ is a polyhedral function with half-integral extreme points $\mathbf{x}$, and this property is preserved after adding an arbitrary combination of constraints of the form $x_i=x_j$, $x_i=1-x_j$, and $x_i=\gamma$ where $\gamma\in\{0, 1,\frac{1}{2}\}$ is a constant.
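For intuition on half-integral extreme points, a classical example (not the construction from the paper) is the LP relaxation of vertex cover:

```latex
\min\; x_1 + x_2 + x_3
\quad \text{s.t.} \quad
x_i + x_j \ge 1 \;\; \forall\, ij \in E,
\qquad 0 \le x_i \le 1 .
```

On the triangle, $E=\{12,23,13\}$, the point $\left(\frac{1}{2},\frac{1}{2},\frac{1}{2}\right)$ is the unique solution of the three edge constraints taken with equality, hence an extreme point of the feasible polytope; it is half-integral but not integral.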