no code implementations • 13 Feb 2023 • David Heckerman
This manuscript is a technical memoir about my work at Stanford and Microsoft Research.
no code implementations • 13 May 2021 • David Heckerman, Dan Geiger
We develop simple methods for constructing likelihoods and parameter priors for learning about the parameters and structure of a Bayesian network.
no code implementations • 5 May 2021 • Dan Geiger, David Heckerman
We develop simple methods for constructing parameter priors for model choice among Directed Acyclic Graphical (DAG) models.
1 code implementation • 1 Feb 2020 • David Heckerman
A Bayesian network is a graphical model that encodes probabilistic relationships among variables of interest.
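A minimal sketch, with hypothetical numbers, of a two-node Bayesian network and the kind of posterior query such a model supports:

```python
# Minimal sketch (hypothetical numbers): a two-node Bayesian network
# Rain -> WetGrass encoded as plain conditional probability tables.
p_rain = {True: 0.2, False: 0.8}
p_wet_given_rain = {True: {True: 0.9, False: 0.1},
                    False: {True: 0.2, False: 0.8}}

def joint(rain, wet):
    # Chain rule for the DAG: P(rain, wet) = P(rain) * P(wet | rain)
    return p_rain[rain] * p_wet_given_rain[rain][wet]

# Posterior P(rain | grass is wet) by enumeration (Bayes' rule).
evidence = sum(joint(r, True) for r in (True, False))
print(round(joint(True, True) / evidence, 3))  # ~0.529
```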
no code implementations • 6 Nov 2019 • David Heckerman
I then use these representations to build Pathfinder, a large normative expert system for the diagnosis of lymph-node diseases (the domain contains over 60 diseases and over 100 disease findings).
no code implementations • 22 Oct 2019 • David Heckerman, Chris Meek
Also, we show how to identify a non-redundant set of parameters for an EBNC, and describe an asymptotic approximation for learning the structure of Bayesian networks that contain EBNCs.
no code implementations • 2 Jan 2018 • David Heckerman
Identifying causal relationships from observational data is difficult, in large part, due to the presence of hidden common causes.
no code implementations • 27 Oct 2016 • Dan Geiger, David Heckerman
We examine three probabilistic concepts related to the sentence "two variables have no bearing on each other".
no code implementations • 27 Jul 2014 • Eric J. Horvitz, David Heckerman
Over the last decade, there has been growing interest in the use of measures of change in belief for reasoning with uncertainty in artificial intelligence research.
no code implementations • 27 Feb 2014 • Jack Kuipers, Giusi Moffa, David Heckerman
We provide a correction to the expression for scoring Gaussian directed acyclic graphical models derived in Geiger and Heckerman [Ann. Statist. 30 (2002) 1412-1440].
no code implementations • 13 Apr 2013 • David Heckerman, E. Mamdani
This is the Proceedings of the Ninth Conference on Uncertainty in Artificial Intelligence, which was held in Washington, DC, July 9-11, 1993.
no code implementations • 27 Mar 2013 • David Heckerman, Eric J. Horvitz
In this paper, we extend the QMR-DT probabilistic model for the domain of internal medicine to include decisions about treatments.
no code implementations • 27 Mar 2013 • David Heckerman
In this paper, an empirical evaluation of three inference methods for uncertain reasoning is presented in the context of Pathfinder, a large expert system for the diagnosis of lymph-node pathology.
no code implementations • 27 Mar 2013 • David Heckerman
A similarity network is a tool for constructing belief networks for the diagnosis of a single fault.
no code implementations • 27 Mar 2013 • David Heckerman, Holly B. Jimison
We present a representation of partial confidence in belief and preference that is consistent with the tenets of decision theory.
no code implementations • 27 Mar 2013 • David Heckerman
The time complexity of the algorithm is O(n m⁻ 2^{m⁺}), where n is the number of diseases, m⁺ is the number of positive findings, and m⁻ is the number of negative findings.
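A minimal sketch, with hypothetical link probabilities, of the two-level noisy-OR evidence computation whose inclusion-exclusion over positive findings produces the 2^{m⁺} factor (not the paper's full algorithm):

```python
# Minimal sketch (hypothetical numbers): evidence probability in a two-level
# noisy-OR diagnostic network, exponential only in the positive findings.
# q[i][j] = P(disease i alone causes finding j); prior[i] = P(disease i present).
from itertools import combinations

prior = [0.1, 0.05, 0.2]                  # 3 diseases (n)
q = [[0.8, 0.0], [0.3, 0.6], [0.0, 0.5]]  # 2 findings

def p_all_negative(find_set):
    # Diseases are independent, so the probability that every finding in
    # find_set is absent factors over the diseases.
    prob = 1.0
    for i, p in enumerate(prior):
        stay_neg = 1.0
        for j in find_set:
            stay_neg *= 1.0 - q[i][j]
        prob *= p * stay_neg + (1.0 - p)
    return prob

def p_evidence(pos, neg):
    # Inclusion-exclusion over subsets of the positive findings:
    # this is where the 2^{m+} term in the running time comes from.
    total = 0.0
    for k in range(len(pos) + 1):
        for subset in combinations(pos, k):
            total += (-1) ** k * p_all_negative(set(subset) | set(neg))
    return total

print(p_evidence(pos=[0], neg=[1]))
```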
no code implementations • 27 Mar 2013 • David Heckerman
In the spirit of Cox, properties for a measure of change in belief are enumerated.
no code implementations • 27 Mar 2013 • Ross D. Shachter, David Heckerman
Much artificial intelligence research focuses on the problem of deducing the validity of unobservable propositions or hypotheses from observable evidence.
no code implementations • 27 Mar 2013 • David Heckerman, John S. Breese, Eric J. Horvitz
We introduce and analyze the problem of the compilation of decision models from a decision-theoretic perspective.
no code implementations • 27 Mar 2013 • Michael P. Wellman, David Heckerman
Architectures for uncertainty handling that take statements in the calculus as objects to be reasoned about offer the prospect of retaining normative status with respect to decision making while supporting the other tasks in uncertain reasoning.
no code implementations • 27 Mar 2013 • David Heckerman, Eric J. Horvitz
However, we argue that in the case of plausible reasoning, rules are syntactically modular but are rarely semantically modular.
no code implementations • 27 Mar 2013 • David Heckerman
This inconsistency is used to argue for a redefinition of certainty factors in terms of the intuitively appealing desiderata associated with the combining functions.
no code implementations • 27 Mar 2013 • Jaap Suermondt, Gregory F. Cooper, David Heckerman
Cutset conditioning and clique-tree propagation are two popular methods for performing exact probabilistic inference in Bayesian belief networks.
no code implementations • 27 Mar 2013 • Dan Geiger, David Heckerman
We examine three probabilistic formulations of the sentence "a and b are totally unrelated with respect to a given set of variables U."
no code implementations • 20 Mar 2013 • Dan Geiger, David Heckerman
This paper discusses multiple Bayesian network representation paradigms for encoding asymmetric independence assertions.
no code implementations • 20 Mar 2013 • David Heckerman, Eric J. Horvitz, Blackford Middleton
Value-of-information analyses provide a straightforward means for selecting the best next observation to make, and for determining whether it is better to gather additional information or to act immediately.
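A minimal sketch, with toy utilities and an assumed binary test, of a myopic value-of-information calculation of this kind: compare acting immediately with observing the test result and then acting:

```python
# Minimal sketch (toy numbers): myopic value of information for one binary
# test before choosing between "treat" and "wait".
p_disease = 0.3
utility = {("treat", True): 80, ("treat", False): 60,
           ("wait", True): 0,   ("wait", False): 100}
sensitivity, specificity = 0.9, 0.95  # assumed test characteristics

def best_eu(p):
    # Expected utility of the best action given P(disease) = p.
    return max(p * utility[(a, True)] + (1 - p) * utility[(a, False)]
               for a in ("treat", "wait"))

eu_now = best_eu(p_disease)  # act immediately

# Observe the test, update by Bayes' rule, then act.
p_pos = sensitivity * p_disease + (1 - specificity) * (1 - p_disease)
p_dis_pos = sensitivity * p_disease / p_pos
p_dis_neg = (1 - sensitivity) * p_disease / (1 - p_pos)
eu_with_test = p_pos * best_eu(p_dis_pos) + (1 - p_pos) * best_eu(p_dis_neg)

print("value of information:", round(eu_with_test - eu_now, 2))
```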
no code implementations • 6 Mar 2013 • David Heckerman
I introduce a temporal belief-network representation of causal independence that a knowledge engineer can use to elicit probabilistic models.
no code implementations • 6 Mar 2013 • David Heckerman, Michael Shwe
We compare the diagnostic accuracy of three diagnostic inference models: the simple Bayes model, the multimembership Bayes model, which is isomorphic to the parallel combination function in the certainty-factor model, and a model that incorporates the noisy OR-gate interaction.
no code implementations • 6 Mar 2013 • Dan Geiger, David Heckerman
We examine two types of similarity networks each based on a distinct notion of relevance.
no code implementations • 27 Feb 2013 • Dan Geiger, David Heckerman
We describe algorithms for learning Bayesian networks from a combination of user knowledge and statistical data.
no code implementations • 27 Feb 2013 • David Heckerman, Dan Geiger, David Maxwell Chickering
Second, we describe local search and annealing algorithms to be used in conjunction with scoring metrics.
no code implementations • 27 Feb 2013 • David Heckerman, John S. Breese
In this representation, the interaction between causes and effect can be written as a nested decomposition of functions.
no code implementations • 27 Feb 2013 • David Heckerman, Ross D. Shachter
Using this definition, we show how causal dependence can be represented within an influence diagram.
no code implementations • 20 Feb 2013 • David Heckerman
We show that these new assumptions, when combined with parameter independence, parameter modularity, and likelihood equivalence, allow us to apply methods for learning acausal networks to learn causal networks.
no code implementations • 20 Feb 2013 • David Heckerman, Ross D. Shachter
We present a precise definition of cause and effect in terms of a fundamental notion called unresponsiveness.
no code implementations • 20 Feb 2013 • David Heckerman, Dan Geiger
We examine Bayesian methods for learning Bayesian networks from a combination of prior knowledge and statistical data.
no code implementations • 13 Feb 2013 • John S. Breese, David Heckerman
We develop and extend existing decision-theoretic methods for troubleshooting a nonfunctioning device.
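A minimal sketch, with hypothetical fault probabilities and costs, of the kind of plan such methods produce; under a single-fault assumption, a standard heuristic inspects components in decreasing order of fault probability per unit cost:

```python
# Minimal sketch (hypothetical numbers): order components by p_fault / cost
# and compute the expected cost of the resulting troubleshooting sequence.
components = {           # name: (P(component is the fault), cost to check/repair)
    "cable":   (0.40, 2.0),
    "printer": (0.35, 10.0),
    "driver":  (0.25, 4.0),
}

plan = sorted(components, key=lambda c: components[c][0] / components[c][1],
              reverse=True)

# A component's cost is paid only if every earlier one was not the fault.
expected_cost, p_not_yet_fixed = 0.0, 1.0
for c in plan:
    p, cost = components[c]
    expected_cost += p_not_yet_fixed * cost
    p_not_yet_fixed -= p
print(plan, round(expected_cost, 2))
```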
no code implementations • 13 Feb 2013 • Dan Geiger, David Heckerman, Christopher Meek
We extend the Bayesian Information Criterion (BIC), an asymptotic approximation for the marginal likelihood, to Bayesian networks with hidden variables.
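A minimal sketch, on toy binary data, of the standard BIC score that the paper extends: the log-likelihood at the maximum-likelihood estimate penalized by (d/2) log N:

```python
# Minimal sketch (toy data): BIC = log-likelihood at the MLE - (d/2) * log N,
# shown for a single binomial model with d = 1 free parameter.
import math

data = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]   # hypothetical binary observations
n = len(data)
theta_hat = sum(data) / n               # maximum-likelihood estimate
log_lik = sum(math.log(theta_hat if x else 1 - theta_hat) for x in data)
d = 1                                    # number of free parameters
bic = log_lik - (d / 2) * math.log(n)
print(round(bic, 3))
```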
no code implementations • 13 Feb 2013 • David Maxwell Chickering, David Heckerman
We consider the Laplace approximation and the less accurate but more efficient BIC/MDL approximation.
no code implementations • 6 Feb 2013 • David Maxwell Chickering, David Heckerman, Christopher Meek
The majority of this work has concentrated on using decision-tree representations for the CPDs.
no code implementations • 6 Feb 2013 • Christopher Meek, David Heckerman
This paper discusses causal independence models and a generalization of these models called causal interaction models.
no code implementations • 30 Jan 2013 • Bo Thiesson, Christopher Meek, David Maxwell Chickering, David Heckerman
We describe computationally efficient methods for learning mixtures in which each component is a directed acyclic graphical model (mixtures of DAGs or MDAGs).
no code implementations • 30 Jan 2013 • Marina Meila, David Heckerman
In the first part of the paper, we perform an experimental comparison between three batch clustering algorithms: the Expectation-Maximization (EM) algorithm, a winner take all version of the EM algorithm reminiscent of the K-means algorithm, and model-based hierarchical agglomerative clustering.
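A minimal sketch, on synthetic one-dimensional data, of EM for a two-component Gaussian mixture, with a flag that switches to the winner-take-all (hard-assignment, K-means-like) variant compared in the paper:

```python
# Minimal sketch (synthetic data): soft EM vs. winner-take-all EM for a
# two-component Gaussian mixture in one dimension.
import math, random

random.seed(0)
data = ([random.gauss(0, 1) for _ in range(50)] +
        [random.gauss(4, 1) for _ in range(50)])
mu, var, w = [0.5, 3.5], [1.0, 1.0], [0.5, 0.5]

def normal_pdf(x, m, v):
    return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

def em_step(hard=False):
    resp = []
    for x in data:                         # E step: responsibilities
        r = [w[k] * normal_pdf(x, mu[k], var[k]) for k in range(2)]
        z = sum(r)
        r = [ri / z for ri in r]
        if hard:                           # winner take all: 0/1 assignment
            top = r.index(max(r))
            r = [1.0 if k == top else 0.0 for k in range(2)]
        resp.append(r)
    for k in range(2):                     # M step: update parameters
        nk = sum(r[k] for r in resp)
        mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
        var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk + 1e-6
        w[k] = nk / len(data)

for _ in range(20):
    em_step(hard=False)
print([round(m, 2) for m in mu])
```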
no code implementations • 30 Jan 2013 • John S. Breese, David Heckerman, Carl Kadie
Results indicate that for a wide range of conditions, Bayesian networks with decision trees at each node and correlation methods outperform Bayesian-clustering and vector-similarity methods.
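A minimal sketch, with hypothetical ratings, of the correlation-based (memory-based) baseline included in the comparison:

```python
# Minimal sketch (hypothetical ratings): Pearson-correlation collaborative
# filtering, predicting a rating as a weighted average of other users'
# deviations from their own means.
import math

ratings = {                      # user -> {item: rating}
    "ann": {"a": 5, "b": 3, "c": 4},
    "bob": {"a": 4, "b": 2, "c": 5, "d": 4},
    "cid": {"a": 1, "b": 5, "d": 2},
}

def pearson(u, v):
    common = set(ratings[u]) & set(ratings[v])
    if len(common) < 2:
        return 0.0
    mu_u = sum(ratings[u][i] for i in common) / len(common)
    mu_v = sum(ratings[v][i] for i in common) / len(common)
    num = sum((ratings[u][i] - mu_u) * (ratings[v][i] - mu_v) for i in common)
    den = math.sqrt(sum((ratings[u][i] - mu_u) ** 2 for i in common) *
                    sum((ratings[v][i] - mu_v) ** 2 for i in common))
    return num / den if den else 0.0

def predict(user, item):
    base = sum(ratings[user].values()) / len(ratings[user])
    num = den = 0.0
    for other in ratings:
        if other != user and item in ratings[other]:
            weight = pearson(user, other)
            mean_other = sum(ratings[other].values()) / len(ratings[other])
            num += weight * (ratings[other][item] - mean_other)
            den += abs(weight)
    return base + num / den if den else base

print(round(predict("ann", "d"), 2))
```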
no code implementations • 30 Jan 2013 • David Heckerman, Eric J. Horvitz
People using consumer software applications typically do not use technical jargon when querying an online database of help topics.
no code implementations • 23 Jan 2013 • Dan Geiger, David Heckerman
We show that the only parameter prior for complete Gaussian DAG models that satisfies global parameter independence, complete model equivalence, and some weak regularity assumptions, is the normal-Wishart distribution.
no code implementations • 23 Jan 2013 • David Maxwell Chickering, David Heckerman
We describe two techniques that significantly improve the running time of several standard machine-learning algorithms when data is sparse.
no code implementations • 12 Dec 2012 • Guy Shani, Ronen I. Brafman, David Heckerman
We argue that it is more appropriate to view the problem of generating recommendations as a sequential decision problem and, consequently, that Markov decision processes (MDP) provide a more appropriate model for recommender systems.
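A minimal sketch, with a toy transition and reward model (states, actions, and numbers are assumptions, not taken from the paper), of the value-iteration computation an MDP-based recommender would rest on:

```python
# Minimal sketch (toy MDP): value iteration where states track the last item
# viewed and actions are recommendations.
states = ["start", "book", "cd"]
actions = ["recommend_book", "recommend_cd"]

# P[s][a] = list of (next_state, probability); R[s][a] = expected immediate profit.
P = {
    "start": {"recommend_book": [("book", 0.6), ("start", 0.4)],
              "recommend_cd":   [("cd", 0.3),   ("start", 0.7)]},
    "book":  {"recommend_book": [("book", 0.5), ("start", 0.5)],
              "recommend_cd":   [("cd", 0.5),   ("start", 0.5)]},
    "cd":    {"recommend_book": [("book", 0.4), ("start", 0.6)],
              "recommend_cd":   [("cd", 0.4),   ("start", 0.6)]},
}
R = {"start": {"recommend_book": 1.0, "recommend_cd": 0.5},
     "book":  {"recommend_book": 0.5, "recommend_cd": 2.0},
     "cd":    {"recommend_book": 1.5, "recommend_cd": 0.5}}

gamma, V = 0.9, {s: 0.0 for s in states}
for _ in range(100):                       # value iteration
    V = {s: max(R[s][a] + gamma * sum(p * V[t] for t, p in P[s][a])
                for a in actions) for s in states}
policy = {s: max(actions,
                 key=lambda a: R[s][a] + gamma * sum(p * V[t] for t, p in P[s][a]))
          for s in states}
print(policy)
```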
no code implementations • 12 Dec 2012 • Carl Kadie, Christopher Meek, David Heckerman
We describe CFW, a computationally efficient algorithm for collaborative filtering that uses posteriors over weights of evidence.
no code implementations • 13 Jun 2012 • Chong Wang, David Blei, David Heckerman
In contrast to the cDTM, the original discrete-time dynamic topic model (dDTM) requires that time be discretized.
no code implementations • 3 May 2012 • Jennifer Listgarten, Christoph Lippert, Eun Yong Kang, Jing Xiang, Carl M. Kadie, David Heckerman
Until now, these approaches did not address confounding by family relatedness and population structure, a problem that is becoming more important as larger data sets are used to increase power.