1 code implementation • NeurIPS 2023 • Harry Bendekgey, Gabriel Hope, Erik B. Sudderth
By composing graphical models with deep learning architectures, we learn generative models with the strengths of both frameworks.
no code implementations • AABI Symposium 2022 • Gabriel Hope, Madina Abdrakhmanova, Xiaoyin Chen, Michael C. Hughes, Erik B. Sudderth
We consider training deep generative models toward two simultaneous goals: discriminative classification and generative modeling using an explicit likelihood.
no code implementations • NeurIPS 2021 • Harry Bendekgey, Erik B. Sudderth
We investigate how fairness relaxations scale to flexible classifiers like deep neural networks for images and text.
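One way such relaxations work is to replace a non-differentiable group-fairness gap with a smooth surrogate that a neural network can be penalized on during training. Below is a minimal sketch of that idea (the function name and the sigmoid surrogate are my illustration, not the paper's exact relaxation): the hard 0/1 acceptance rate per group is replaced by a sigmoid of the classifier score, making the demographic-parity gap differentiable.

```python
import numpy as np

def parity_relaxation(scores, groups):
    """Smooth surrogate for a demographic-parity gap (illustrative sketch).

    The indicator "score > 0" is relaxed to a sigmoid, so the gap between
    group-wise acceptance rates can be penalized by gradient descent.
    """
    probs = 1.0 / (1.0 + np.exp(-scores))   # smooth acceptance probability
    rate0 = probs[groups == 0].mean()       # soft acceptance rate, group 0
    rate1 = probs[groups == 1].mean()       # soft acceptance rate, group 1
    return abs(rate0 - rate1)

scores = np.array([2.0, 1.5, -0.5, -1.0])  # classifier scores
groups = np.array([0, 0, 1, 1])            # protected-group labels
gap = parity_relaxation(scores, groups)    # large gap: group 0 is favored
```

In a training loop, this gap would be added to the classification loss with a tunable weight, trading accuracy against fairness.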
no code implementations • 12 Dec 2020 • Gabriel Hope, Madina Abdrakhmanova, Xiaoyin Chen, Michael C. Hughes, Erik B. Sudderth
We develop a new framework for learning variational autoencoders and other deep generative models that balances generative and discriminative goals.
no code implementations • 11 Jun 2019 • Zhile Ren, Erik B. Sudderth
We develop new representations and algorithms for three-dimensional (3D) object detection and spatial layout prediction in cluttered indoor scenes.
Ranked #22 on 3D Object Detection on SUN-RGBD val
no code implementations • ICCV 2019 • Daeyun Shin, Zhile Ren, Erik B. Sudderth, Charless C. Fowlkes
We tackle the problem of automatically reconstructing a complete 3D model of a scene from a single RGB image.
2 code implementations • 23 Oct 2018 • Zhile Ren, Orazio Gallo, Deqing Sun, Ming-Hsuan Yang, Erik B. Sudderth, Jan Kautz
To date, top-performing optical flow estimation methods only take pairs of consecutive frames into account.
no code implementations • CVPR 2018 • Zhile Ren, Erik B. Sudderth
We develop a 3D object detection algorithm that uses latent support surfaces to capture contextual relationships in indoor scenes.
Ranked #24 on 3D Object Detection on SUN-RGBD val
no code implementations • 1 Dec 2017 • Michael C. Hughes, Gabriel Hope, Leah Weiner, Thomas H. McCoy, Roy H. Perlis, Erik B. Sudderth, Finale Doshi-Velez
Supervisory signals can help topic models discover low-dimensional data representations that are more interpretable for clinical tasks.
no code implementations • 10 Nov 2017 • Geng Ji, Robert Bamler, Erik B. Sudderth, Stephan Mandt
Word2vec (Mikolov et al., 2013) has proven to be successful in natural language processing by capturing the semantic relationships between different words.
1 code implementation • ICML 2017 • Geng Ji, Michael C. Hughes, Erik B. Sudderth
Our model is based on a novel, variational interpretation of the popular expected patch log-likelihood (EPLL) method as a model for randomly positioned grids of image patches.
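The grid interpretation can be made concrete with a short sketch (function name and details are mine, not the paper's code): for a randomly drawn offset, the image is tiled into non-overlapping patches, and the EPLL objective averages over such random grid positions.

```python
import numpy as np

def patch_grid(image, patch=8, rng=None):
    """Extract one randomly shifted, non-overlapping grid of image patches.

    Under the grid interpretation sketched above, the model averages over
    all grid offsets; here a single offset is drawn uniformly at random.
    """
    rng = np.random.default_rng() if rng is None else rng
    dy, dx = rng.integers(0, patch, size=2)   # random grid offset
    h, w = image.shape
    # Crop so the shifted region tiles exactly into patch-sized blocks.
    hh = ((h - dy) // patch) * patch
    ww = ((w - dx) // patch) * patch
    crop = image[dy:dy + hh, dx:dx + ww]
    # Reshape the crop into a (rows, cols, patch, patch) array of patches.
    patches = crop.reshape(hh // patch, patch, ww // patch, patch)
    return patches.transpose(0, 2, 1, 3)

img = np.arange(32 * 32, dtype=float).reshape(32, 32)
grid = patch_grid(img, patch=8)
# last two dims are (8, 8); the number of rows/cols depends on the offset
```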
no code implementations • 26 Jul 2017 • Zhile Ren, Deqing Sun, Jan Kautz, Erik B. Sudderth
Given two consecutive frames from a pair of stereo cameras, 3D scene flow methods simultaneously estimate the 3D geometry and motion of the observed scene.
no code implementations • 23 Jul 2017 • Michael C. Hughes, Leah Weiner, Gabriel Hope, Thomas H. McCoy Jr., Roy H. Perlis, Erik B. Sudderth, Finale Doshi-Velez
Supervisory signals have the potential to make low-dimensional data representations, like those learned by mixture and topic models, more interpretable and useful.
no code implementations • 23 Sep 2016 • Michael C. Hughes, Erik B. Sudderth
Mixture models and topic models generate each observation from a single cluster, but standard variational posteriors for each observation assign positive probability to all possible clusters.
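The contrast between dense and sparse posteriors can be sketched in a few lines (function name and the top-L scheme shown are my illustration of the general idea, not the paper's exact algorithm): each observation's dense posterior over K clusters is truncated to its L largest responsibilities and renormalized.

```python
import numpy as np

def sparsify_posterior(resp, top_l=3):
    """Keep only each row's top-L cluster responsibilities, renormalized.

    `resp` is an (N, K) matrix of per-observation cluster probabilities;
    the result assigns positive probability to at most L clusters per row.
    """
    n, k = resp.shape
    keep = np.argsort(resp, axis=1)[:, -top_l:]   # indices of top-L entries
    sparse = np.zeros_like(resp)
    rows = np.arange(n)[:, None]
    sparse[rows, keep] = resp[rows, keep]         # copy only the kept mass
    return sparse / sparse.sum(axis=1, keepdims=True)

dense = np.array([[0.5, 0.3, 0.1, 0.06, 0.04]])
sparse = sparsify_posterior(dense, top_l=2)
# sparse is [[0.625, 0.375, 0., 0., 0.]]: top two entries kept, renormalized
```

Sparse responsibilities like these shrink the cost of the subsequent parameter updates, since zero entries contribute nothing to the sufficient statistics.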
no code implementations • CVPR 2016 • Zhile Ren, Erik B. Sudderth
We develop new representations and algorithms for three-dimensional (3D) object detection and spatial layout prediction in cluttered indoor scenes.
Ranked #25 on 3D Object Detection on SUN-RGBD val
no code implementations • CVPR 2015 • Deqing Sun, Erik B. Sudderth, Hanspeter Pfister
As consumer depth sensors become widely available, estimating scene flow from RGBD sequences has received increasing attention.
no code implementations • 22 Aug 2013 • Emily B. Fox, Michael C. Hughes, Erik B. Sudderth, Michael I. Jordan
We propose a Bayesian nonparametric approach to the problem of jointly modeling multiple related time series.
no code implementations • CVPR 2013 • Deqing Sun, Jonas Wulff, Erik B. Sudderth, Hanspeter Pfister, Michael J. Black
Layered models allow scene segmentation and motion estimation to be formulated together and to inform one another.
2 code implementations • NeurIPS 2012 • Soumya Ghosh, Matthew Loper, Erik B. Sudderth, Michael J. Black
We develop a method for discovering the parts of an articulated object from aligned meshes capturing various three-dimensional (3D) poses.
no code implementations • NeurIPS 2012 • Jason Pacheco, Erik B. Sudderth
We develop convergent minimization algorithms for Bethe variational approximations which explicitly constrain marginal estimates to families of valid distributions.
no code implementations • NeurIPS 2012 • Michael C. Hughes, Emily Fox, Erik B. Sudderth
Applications of Bayesian nonparametric methods require learning and inference algorithms which efficiently explore models of unbounded complexity.
no code implementations • NeurIPS 2012 • Michael Bryant, Erik B. Sudderth
Variational methods provide a computationally scalable alternative to Monte Carlo methods for large-scale, Bayesian nonparametric learning.
no code implementations • NeurIPS 2011 • Dae I. Kim, Erik B. Sudderth
Topic models are learned from statistical patterns of variation within document collections, but are designed to extract meaningful semantic structure.
no code implementations • NeurIPS 2011 • Soumya Ghosh, Andrei B. Ungureanu, Erik B. Sudderth, David M. Blei
The distance dependent Chinese restaurant process (ddCRP) was recently introduced to accommodate random partitions of non-exchangeable data.
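The ddCRP's generative mechanism is simple enough to sketch directly (a minimal illustration with an exponential decay function; names and defaults are mine): each customer links to another with probability proportional to a decaying function of their distance, or to itself with probability proportional to a concentration parameter, and clusters are the connected components of the resulting link graph.

```python
import numpy as np

def ddcrp_partition(dist, alpha=1.0, decay=1.0, rng=None):
    """Sample one partition from a distance-dependent CRP (a sketch).

    Customer i links to j with probability proportional to
    exp(-decay * dist[i, j]), or to itself with weight alpha; clusters
    are the connected components of the customer-link graph.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = dist.shape[0]
    links = np.empty(n, dtype=int)
    for i in range(n):
        w = np.exp(-decay * dist[i])
        w[i] = alpha                              # self-link weight
        links[i] = rng.choice(n, p=w / w.sum())
    # Label connected components by following link chains until we hit
    # an already-labeled customer or close a cycle.
    labels = -np.ones(n, dtype=int)
    next_label = 0
    for i in range(n):
        path, j = [], i
        while labels[j] < 0 and j not in path:
            path.append(j)
            j = links[j]
        if labels[j] >= 0:
            label = labels[j]                     # joined an existing cluster
        else:
            label, next_label = next_label, next_label + 1
        for p in path:
            labels[p] = label
    return labels
```

Unlike the exchangeable CRP, the induced partition depends on the distances, so nearby observations (in space, time, or a feature metric) tend to share clusters.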
no code implementations • NeurIPS 2010 • Nimar Arora, Stuart J. Russell, Paul Kidwell, Erik B. Sudderth
The International Monitoring System (IMS) is a global network of sensors whose purpose is to identify potential violations of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), primarily through detection and localization of seismic events.
no code implementations • NeurIPS 2010 • Deqing Sun, Erik B. Sudderth, Michael J. Black
We present a new probabilistic model of optical flow in layers that addresses many of the shortcomings of previous approaches.
1 code implementation • 19 Mar 2010 • Emily B. Fox, Erik B. Sudderth, Michael I. Jordan, Alan S. Willsky
Many complex dynamical phenomena can be effectively modeled by a system that switches among a set of conditionally linear dynamical modes.
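The switching construction is easy to state generatively (a minimal forward-simulation sketch; function name, noise scale, and parameters are mine for illustration): a discrete mode evolves as a Markov chain, and the continuous state follows the linear dynamics of whichever mode is active.

```python
import numpy as np

def simulate_slds(trans, modes, steps, x0, rng=None):
    """Simulate a switching linear dynamical system (illustrative sketch).

    The mode z_t follows a Markov chain with transition matrix `trans`;
    the state follows x_t = A[z_t] @ x_{t-1} + noise, with one dynamics
    matrix A per mode.
    """
    rng = np.random.default_rng() if rng is None else rng
    k = trans.shape[0]
    z = np.empty(steps, dtype=int)
    x = np.empty((steps, x0.shape[0]))
    z[0], x[0] = 0, x0
    for t in range(1, steps):
        z[t] = rng.choice(k, p=trans[z[t - 1]])   # mode switch
        x[t] = modes[z[t]] @ x[t - 1] + 0.01 * rng.standard_normal(x0.shape)
    return z, x

A = [np.array([[0.99, -0.1], [0.1, 0.99]]),   # mode 0: slow rotation
     np.array([[0.9, 0.0], [0.0, 0.9]])]      # mode 1: decay to the origin
trans = np.array([[0.95, 0.05], [0.05, 0.95]])
z, x = simulate_slds(trans, A, steps=100, x0=np.ones(2))
```

The inferential problem the papers address runs this generative story in reverse: recovering the mode sequence and dynamics from the observed trajectory, with the number of modes itself unknown in the Bayesian nonparametric treatments.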
no code implementations • NeurIPS 2009 • Emily Fox, Michael I. Jordan, Erik B. Sudderth, Alan S. Willsky
We propose a Bayesian nonparametric approach to relating multiple time series via a set of latent, dynamical behaviors.
no code implementations • 15 May 2009 • Emily B. Fox, Erik B. Sudderth, Michael I. Jordan, Alan S. Willsky
To address this problem, we take a Bayesian nonparametric approach to speaker diarization that builds on the hierarchical Dirichlet process hidden Markov model (HDP-HMM) of Teh et al.
no code implementations • NeurIPS 2008 • Erik B. Sudderth, Michael I. Jordan
We develop a statistical framework for the simultaneous, unsupervised segmentation and discovery of visual object categories from image databases.
no code implementations • NeurIPS 2008 • Emily Fox, Erik B. Sudderth, Michael I. Jordan, Alan S. Willsky
Many nonlinear dynamical phenomena can be effectively modeled by a system that switches among a set of conditionally linear dynamical modes.
no code implementations • NeurIPS 2007 • Alan S. Willsky, Erik B. Sudderth, Martin J. Wainwright
Variational methods are frequently used to approximate or bound the partition or likelihood function of a Markov random field.