no code implementations • NeurIPS Workshop AI4Science 2021 • Ralph Ma, Gabriel Hart Stocker Dreiman, Fiorella Ruggiu, Adam Joseph Riesselman, Bowen Liu, Keith James, Mohammad Sultan, Daphne Koller
DNA encoded libraries (DELs) are pooled, combinatorial compound collections where each member is tagged with its own unique DNA barcode.
no code implementations • 20 Aug 2018 • Samuel S. Schoenholz, Sean Hackett, Laura Deming, Eugene Melamud, Navdeep Jaitly, Fiona McAllister, Jonathon O'Brien, George Dahl, Bryson Bennett, Andrew M. Dai, Daphne Koller
As in many other scientific domains, we face a fundamental problem when using machine learning to identify proteins from mass spectrometry data: large ground truth datasets mapping inputs to correct outputs are extremely difficult to obtain.
1 code implementation • 12 Jul 2018 • Emma Pierson, Pang Wei Koh, Tatsunori Hashimoto, Daphne Koller, Jure Leskovec, Nicholas Eriksson, Percy Liang
Motivated by the study of human aging, we present an interpretable latent-variable model that learns temporal dynamics from cross-sectional data.
no code implementations • 9 Jul 2013 • Chris Piech, Jonathan Huang, Zhenghao Chen, Chuong Do, Andrew Ng, Daphne Koller
In massive open online courses (MOOCs), peer grading serves as a critical tool for scaling the grading of complex, open-ended assignments to courses with tens or hundreds of thousands of students.
1 code implementation • 13 Feb 2013 • Craig Boutilier, Nir Friedman, Moises Goldszmidt, Daphne Koller
Bayesian networks provide a language for qualitatively representing the conditional independence properties of a distribution.
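The qualitative claim a Bayesian network makes can be checked numerically. A minimal sketch (hypothetical binary variables and made-up CPTs, not from the paper): a network with edges A→B and A→C asserts B ⊥ C | A, and the factored joint satisfies that independence exactly.

```python
import itertools

# Hypothetical 3-node network A -> B, A -> C; it asserts B is
# independent of C given A. The joint factorizes as
# P(a, b, c) = P(a) * P(b | a) * P(c | a).
p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}
p_c_given_a = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}

def joint(a, b, c):
    """Joint probability under the network's factorization."""
    return p_a[a] * p_b_given_a[a][b] * p_c_given_a[a][c]

# Verify the conditional independence the graph encodes:
# P(b, c | a) == P(b | a) * P(c | a) for every assignment.
for a, b, c in itertools.product([0, 1], repeat=3):
    p_bc_given_a = joint(a, b, c) / p_a[a]
    assert abs(p_bc_given_a - p_b_given_a[a][b] * p_c_given_a[a][c]) < 1e-12
```

The independence holds for any choice of CPTs, which is the sense in which the graph is a *qualitative* statement about the distribution.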
no code implementations • 19 Jan 2013 • John Breese, Daphne Koller
This is the Proceedings of the Seventeenth Conference on Uncertainty in Artificial Intelligence, which was held in Seattle, WA, August 2-5, 2001.
no code implementations • NeurIPS 2012 • Kevin Tang, Vignesh Ramanathan, Li Fei-Fei, Daphne Koller
In this paper, we tackle the problem of adapting object detectors learned from images to work well on videos.
no code implementations • 4 Jul 2012 • Uri Nodelman, Christian R. Shelton, Daphne Koller
A CTBN is a directed (possibly cyclic) dependency graph over a set of variables, each of which represents a finite state continuous time Markov process whose transition model is a function of its parents.
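The structure described above can be sketched in code: each variable carries one conditional intensity matrix per instantiation of its parents, where rows sum to zero and off-diagonal entries are nonnegative transition rates. The variable names and rates below are hypothetical, chosen only to illustrate the representation.

```python
# Toy CTBN sketch (hypothetical nodes and rates): each node maps a
# parent instantiation to an intensity matrix over its own states.
ctbn = {
    # "Load" has no parents: a single 2-state intensity matrix.
    "Load": {(): [[-0.5, 0.5],
                  [1.0, -1.0]]},
    # "Server" has parent "Load": one matrix per parent state.
    "Server": {
        (0,): [[-0.1, 0.1],
               [2.0, -2.0]],
        (1,): [[-3.0, 3.0],
               [0.2, -0.2]],
    },
}

def is_valid_intensity(q):
    """Check intensity-matrix constraints: nonnegative off-diagonal
    rates, and each row summing to zero (the diagonal entry is minus
    the total rate of leaving that state)."""
    for i, row in enumerate(q):
        if any(rate < 0 for j, rate in enumerate(row) if j != i):
            return False
        if abs(sum(row)) > 1e-12:
            return False
    return True

assert all(is_valid_intensity(q)
           for cims in ctbn.values()
           for q in cims.values())
```

Because the parent's value selects which intensity matrix governs a node at each instant, cycles are unproblematic here in a way they are not for ordinary Bayesian networks.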
no code implementations • 4 Jul 2012 • Uri Nodelman, Daphne Koller, Christian R. Shelton
Continuous time Bayesian networks (CTBNs) describe structured stochastic processes with finitely many states that evolve over continuous time.
no code implementations • NeurIPS 2011 • Tianshi Gao, Daphne Koller
Many of these tasks are tackled by constructing a set of classifiers that are applied at test time and pieced together in a fixed procedure determined in advance or at training time.
no code implementations • NeurIPS 2010 • M. P. Kumar, Benjamin Packer, Daphne Koller
Latent variable models are a powerful tool for addressing several tasks in machine learning.
no code implementations • NeurIPS 2009 • Stephen Gould, Tianshi Gao, Daphne Koller
Object detection and multi-class image segmentation are two closely related tasks that can be greatly improved when solved jointly by feeding information from one task to the other.
no code implementations • NeurIPS 2009 • M. P. Kumar, Daphne Koller
The problem of approximating a given probability distribution using a simpler distribution plays an important role in several areas of machine learning, e.g., variational inference and classification.
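A minimal worked instance of this approximation problem (with a made-up joint distribution, not from the paper): project a correlated discrete distribution onto the family of fully factorized distributions by minimizing KL(p || q), whose minimizer simply matches the marginals of p.

```python
import itertools
import math

# Hypothetical correlated joint over two binary variables.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def kl(p, q):
    """KL(p || q) for discrete distributions on the same support."""
    return sum(px * math.log(px / q[x]) for x, px in p.items())

# Minimizing KL(p || q) over fully factorized q (the M-projection)
# yields the product of p's marginals.
p_x = {v: sum(px for (x, _), px in p.items() if x == v) for v in (0, 1)}
p_y = {v: sum(px for (_, y), px in p.items() if y == v) for v in (0, 1)}
q_star = {(x, y): p_x[x] * p_y[y]
          for x, y in itertools.product((0, 1), repeat=2)}

# Any other product distribution is no closer to p in this divergence.
q_other = {(x, y): (0.6 if x == 0 else 0.4) * (0.3 if y == 0 else 0.7)
           for x, y in itertools.product((0, 1), repeat=2)}
assert kl(p, q_star) <= kl(p, q_other)
```

The residual KL(p || q_star) is exactly the mutual information that the factorized family cannot capture, which is the quantity such approximation schemes trade away for tractability.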
no code implementations • NeurIPS 2008 • Geremy Heitz, Stephen Gould, Ashutosh Saxena, Daphne Koller
We demonstrate the effectiveness of our method on a large set of natural images by combining the subtasks of scene categorization, object detection, multiclass image segmentation, and 3D scene reconstruction.