no code implementations • NeurIPS 2020 • Yingcong Tan, Daria Terekhov, Andrew Delong
We propose a flexible gradient-based framework for learning linear programs from optimal decisions.
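One way to realize such a gradient-based approach is to treat the forward optimization as a differentiable layer and fit LP parameters by gradient descent on a decision-error loss. The sketch below is a minimal, hedged illustration of that idea using the third-party `cvxpylayers` package, not the authors' code: a small quadratic term is added so the argmin has usable gradients, only the cost vector is learned (narrower than learning the full linear program), and all problem data are arbitrary toy values.

```python
# Hedged sketch (not the paper's implementation): recover the cost vector c of
# an LP  min_x c^T x  s.t.  A x <= b, x >= 0  from one observed optimal decision,
# by differentiating through a smoothed version of the solver.
import cvxpy as cp
import numpy as np
import torch
from cvxpylayers.torch import CvxpyLayer

n = 2
A = np.array([[1.0, 1.0], [1.0, -1.0]])   # toy constraint data
b = np.array([1.0, 0.5])

x = cp.Variable(n)
c = cp.Parameter(n)
# Small quadratic term makes the argmin differentiable w.r.t. c.
objective = cp.Minimize(c @ x + 0.05 * cp.sum_squares(x))
problem = cp.Problem(objective, [A @ x <= b, x >= 0])
layer = CvxpyLayer(problem, parameters=[c], variables=[x])

x_obs = torch.tensor([0.25, 0.75])                      # observed optimal decision
c_hat = torch.tensor([1.0, 1.0], requires_grad=True)    # initial cost guess
opt = torch.optim.Adam([c_hat], lr=0.05)

for step in range(200):
    opt.zero_grad()
    x_pred, = layer(c_hat)                   # forward: solve the smoothed LP
    loss = torch.sum((x_pred - x_obs) ** 2)  # decision error
    loss.backward()                          # gradients flow through the solver
    opt.step()

print("learned cost vector:", c_hat.detach().numpy())
```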
1 code implementation • 3 Dec 2018 • Yingcong Tan, Andrew Delong, Daria Terekhov
Given a set of observations generated by an optimization process, the goal of inverse optimization is to determine likely parameters of that process.
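As a deliberately naive illustration of that problem statement (not of the method in this paper): the forward process below solves a small LP for a hidden cost vector, and candidate cost vectors are scored by how closely the decisions they induce match the observation. The LP data `A`, `b` and the candidate set are arbitrary choices for illustration.

```python
# Hedged illustration of the inverse-optimization setup: score candidate
# parameters of the forward process by the decision error they produce.
import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 1.0], [1.0, -1.0]])
b = np.array([1.0, 0.5])

def forward(c):
    """Forward optimization: optimal decision of  min_x c^T x  s.t.  A x <= b, x >= 0."""
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * len(c))
    return res.x

# Observations generated by a hidden cost vector.
c_true = np.array([-1.0, -2.0])
x_obs = forward(c_true)

# Inverse optimization (crude search): pick the candidate whose induced decision fits best.
candidates = [np.array([np.cos(t), np.sin(t)]) for t in np.linspace(0, 2 * np.pi, 64)]
best_c = min(candidates, key=lambda c: np.sum((forward(c) - x_obs) ** 2))
print("observed decision:", x_obs, "recovered cost direction:", best_c)
```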
2 code implementations • 17 Dec 2017 • Nathan Killoran, Leo J. Lee, Andrew Delong, David Duvenaud, Brendan J. Frey
We propose generative neural network methods to generate DNA sequences and tune them to have desired properties.
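A hedged sketch of the general idea, not the authors' implementation: a small generator network maps a latent vector to a per-position distribution over {A, C, G, T}, and the latent code is then tuned by gradient ascent against a property predictor. The `scorer` network here is an untrained stand-in for whatever differentiable property model one would actually use.

```python
# Hedged sketch: generate soft one-hot DNA and tune the latent code toward a
# desired property predicted by a (stand-in) scorer network.
import torch
import torch.nn as nn

SEQ_LEN, LATENT = 50, 16
ALPHABET = "ACGT"

generator = nn.Sequential(
    nn.Linear(LATENT, 128), nn.ReLU(),
    nn.Linear(128, SEQ_LEN * 4),
)
scorer = nn.Sequential(                 # stand-in for a trained property predictor
    nn.Flatten(), nn.Linear(SEQ_LEN * 4, 1),
)

def generate(z):
    """Return soft one-hot DNA of shape (batch, SEQ_LEN, 4)."""
    logits = generator(z).view(-1, SEQ_LEN, 4)
    return torch.softmax(logits, dim=-1)

z = torch.randn(1, LATENT, requires_grad=True)
opt = torch.optim.Adam([z], lr=0.1)
for _ in range(100):                    # tune z to increase the predicted property
    opt.zero_grad()
    score = scorer(generate(z)).mean()
    (-score).backward()
    opt.step()

seq = "".join(ALPHABET[i] for i in generate(z)[0].argmax(dim=-1).tolist())
print(seq)
```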
no code implementations • CVPR 2014 • Lena Gorelick, Yuri Boykov, Olga Veksler, Ismail Ben Ayed, Andrew Delong
We propose a general optimization framework based on local submodular approximations (LSA).
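The rough idea can be illustrated on a toy binary pairwise energy: around the current labeling, non-submodular pairwise terms are replaced by a unary approximation that is exact at that labeling, the resulting submodular surrogate is minimized, and the process repeats. The sketch below is only in that spirit, not the paper's LSA algorithms (which additionally control each step, e.g. with a trust region), and it brute-forces the tiny surrogate where a graph cut would be used in practice.

```python
# Hedged toy sketch of the local-submodular-approximation idea on a chain of
# binary variables with random unary and pairwise terms.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 8
unary = rng.normal(size=(n, 2))                          # u[p, x_p]
edges = [(p, p + 1) for p in range(n - 1)]
pairwise = {e: rng.normal(size=(2, 2)) for e in edges}   # V[x_p, x_q], possibly non-submodular

def energy(x):
    return (sum(unary[p, x[p]] for p in range(n))
            + sum(V[x[p], x[q]] for (p, q), V in pairwise.items()))

def surrogate_energy(x, y):
    """Submodular approximation of energy() built around the labeling y."""
    total = sum(unary[p, x[p]] for p in range(n))
    for (p, q), V in pairwise.items():
        if V[0, 0] + V[1, 1] <= V[0, 1] + V[1, 0]:       # already submodular: keep
            total += V[x[p], x[q]]
        else:                                            # linearize around y (exact at x = y)
            total += V[x[p], y[q]] + V[y[p], x[q]] - V[y[p], y[q]]
    return total

y = tuple(int(v) for v in rng.integers(0, 2, size=n))
for _ in range(20):
    # Minimize the surrogate (brute force here; a graph cut in practice).
    y_new = min(itertools.product((0, 1), repeat=n), key=lambda x: surrogate_energy(x, y))
    if y_new == y:
        break
    y = y_new
print("labeling:", y, "energy:", energy(y))
```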
no code implementations • NeurIPS 2012 • Andrew Delong, Olga Veksler, Anton Osokin, Yuri Boykov
Inference on high-order graphical models has become increasingly important in recent years.