NeurIPS 2010 • M. P. Kumar, Benjamin Packer, Daphne Koller
Latent variable models are a powerful tool for addressing several tasks in machine learning.
NeurIPS 2009 • M. P. Kumar, Daphne Koller
The problem of approximating a given probability distribution by a simpler distribution plays an important role in several areas of machine learning, e.g., variational inference and classification.
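As a toy illustration of approximating a distribution with a simpler one (this is not the paper's method), a minimal sketch below fits a single Gaussian to a two-component Gaussian mixture by moment matching, which minimizes the forward KL divergence KL(p‖q) over Gaussians q. The mixture weights, means, and variances are made-up values for the example.

```python
# Target: a mixture of two 1-D Gaussians (hypothetical weights, means, variances).
weights = [0.5, 0.5]
means = [-2.0, 2.0]
variances = [1.0, 1.0]

# Moment matching: the single Gaussian minimizing KL(p || q) over q
# matches the mixture's overall mean and variance.
m = sum(w * mu for w, mu in zip(weights, means))
v = sum(w * (var + mu**2) for w, mu, var in zip(weights, means, variances)) - m**2

print(m, v)  # fitted Gaussian: mean 0.0, variance 5.0
```

Note that the moment-matched Gaussian covers both mixture modes with one broad bump, which is the characteristic behavior of forward-KL approximations.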
NeurIPS 2008 • Philip Torr, M. P. Kumar
Compared to previous approaches based on the LP relaxation, e.g., interior-point algorithms or tree-reweighted message passing (TRW), our method is faster because it uses only the efficient st-mincut algorithm in its design.
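To illustrate the st-mincut subroutine the paper builds on (this sketch is not the paper's algorithm itself), the snippet below performs exact MAP inference for a tiny binary MRF with submodular pairwise terms via the standard reduction to st-mincut, using `networkx` for the max-flow/min-cut computation. The unary costs, edges, and smoothing weight are made-up values.

```python
import networkx as nx

# Energy: E(x) = sum_i theta_i(x_i) + lam * sum_{(i,j)} [x_i != x_j], x_i in {0, 1}.
# Hypothetical 3-node chain: node 0 prefers label 0, node 2 prefers label 1.
unary = {0: (0.0, 4.0), 1: (2.0, 2.0), 2: (5.0, 0.0)}  # (theta_i(0), theta_i(1))
edges = [(0, 1), (1, 2)]
lam = 1.0

# Standard graph construction: source side of the cut <=> label 0.
G = nx.DiGraph()
for i, (t0, t1) in unary.items():
    G.add_edge("s", i, capacity=t1)  # cutting s->i pays theta_i(1), i.e. x_i = 1
    G.add_edge(i, "t", capacity=t0)  # cutting i->t pays theta_i(0), i.e. x_i = 0
for i, j in edges:
    G.add_edge(i, j, capacity=lam)   # cut once per disagreeing neighbor pair
    G.add_edge(j, i, capacity=lam)

cut_value, (source_side, sink_side) = nx.minimum_cut(G, "s", "t")
labels = {i: (0 if i in source_side else 1) for i in unary}
print(cut_value, labels)  # cut value equals the minimum energy
```

The cut value here is 3.0, the minimum of the energy over all eight labelings; because the pairwise terms are submodular, the min-cut labeling is a global optimum.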