Belief Propagation for Linear Programming

17 May 2013  ·  Andrew Gelfand, Jinwoo Shin, Michael Chertkov

Belief Propagation (BP) is a popular, distributed heuristic for performing MAP computations in Graphical Models. BP can be interpreted, from a variational perspective, as minimizing the Bethe Free Energy (BFE). BP can also be used to solve a special class of Linear Programming (LP) problems. For this class of problems, MAP inference can be stated as an integer LP with an LP relaxation that coincides with minimization of the BFE at "zero temperature". We generalize these prior results and establish a tight characterization of the LP problems that can be formulated as an equivalent LP relaxation of MAP inference. Moreover, we suggest an efficient, iterative annealing BP algorithm for solving this broader class of LP problems. We demonstrate the algorithm's performance on a set of weighted matching problems by using it as a cutting-plane method to solve a sequence of LPs tightened by adding "blossom" inequalities.
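
To make the cutting-plane setting concrete, below is a minimal sketch (not the paper's annealed BP algorithm) of the LP relaxation of maximum-weight matching on a triangle and the effect of adding the corresponding blossom inequality. The example graph, variable names, and the use of scipy.optimize.linprog as the solver are assumptions made purely for illustration; in the paper, each LP in the tightened sequence is instead handled by the iterative annealing BP procedure.

    # A minimal sketch, not the paper's algorithm: it illustrates the LP
    # relaxation of maximum-weight matching and how a single "blossom"
    # inequality tightens it. The triangle graph, unit weights, and use of
    # scipy.optimize.linprog are illustrative assumptions.
    import numpy as np
    from scipy.optimize import linprog

    # Triangle graph: vertices {0, 1, 2}; edges (0,1), (1,2), (0,2); unit weights.
    edges = [(0, 1), (1, 2), (0, 2)]
    weights = np.ones(len(edges))

    # Degree constraints: for each vertex v, its incident edge variables sum to <= 1.
    A_deg = np.zeros((3, len(edges)))
    for j, (u, v) in enumerate(edges):
        A_deg[u, j] = 1.0
        A_deg[v, j] = 1.0
    b_deg = np.ones(3)

    # Plain LP relaxation: maximize w^T x, i.e. minimize -w^T x, over 0 <= x <= 1.
    res = linprog(-weights, A_ub=A_deg, b_ub=b_deg,
                  bounds=[(0, 1)] * len(edges), method="highs")
    print("LP relaxation:", res.x)  # fractional optimum [0.5, 0.5, 0.5]

    # Blossom inequality for the odd vertex set S = {0, 1, 2}:
    #   sum of edges inside S <= (|S| - 1) / 2 = 1
    A_cut = np.vstack([A_deg, np.ones((1, len(edges)))])
    b_cut = np.append(b_deg, 1.0)
    res_tight = linprog(-weights, A_ub=A_cut, b_ub=b_cut,
                        bounds=[(0, 1)] * len(edges), method="highs")
    print("Tightened LP:", res_tight.x)  # integral matching, e.g. [1, 0, 0] (vertex chosen is solver-dependent)

A cutting-plane method repeats this pattern: solve the current relaxation, find a violated blossom inequality (here, the one for the odd set {0, 1, 2}), add it, and re-solve until the solution is integral.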
