Proxy causal learning (PCL) is a method for estimating the causal effect of treatments on outcomes in the presence of unobserved confounding, using proxies (structured side information) for the confounder.
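The two-stage idea behind proximal methods can be illustrated in a linear-Gaussian toy model. This is only a sketch, not the nonparametric PCL estimator itself: all variable names, the linear structural equations, and the two-stage ordinary-least-squares recipe (regress the outcome proxy on treatment and treatment proxy, then regress the outcome on treatment and that fitted proxy) are illustrative assumptions for this simplified setting.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
beta = 2.0                      # true causal effect of treatment A on outcome Y

# Hypothetical linear-Gaussian data: U is the unobserved confounder,
# Z is a treatment-side proxy, W is an outcome-side proxy of U.
U = rng.normal(size=n)
Z = U + rng.normal(size=n)
W = U + rng.normal(size=n)
A = U + rng.normal(size=n)      # treatment, confounded by U
Y = beta * A + 1.5 * U + rng.normal(size=n)

def ols(cols, y):
    # Least-squares fit with an intercept; returns [intercept, coefs...]
    X = np.column_stack([np.ones(len(y)), *cols])
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Naive regression of Y on A is biased upward by the unobserved U.
naive = ols([A], Y)[1]

# Stage 1: regress the outcome proxy W on (A, Z) to get What = E[W | A, Z].
c = ols([A, Z], W)
What = c[0] + c[1] * A + c[2] * Z

# Stage 2: regress Y on (A, What); the coefficient on A recovers beta,
# because E[Y | A, Z] = beta * A + gamma * E[W | A, Z] in this model.
beta_hat = ols([A, What], Y)[1]
```

With this sample size `beta_hat` lands close to the true effect of 2.0 while `naive` does not, despite `U` never being observed.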
Statistical tasks such as density estimation and approximate Bayesian inference often involve densities with unknown normalising constants.
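One standard way to work with a density known only up to its normalising constant is self-normalised importance sampling, where the unknown constant cancels in the weight normalisation. A minimal sketch, assuming an illustrative one-dimensional unnormalised target and a Gaussian proposal chosen for this example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalised target (normalising constant treated as unknown):
# p_tilde(x) = exp(-x^2/2) * (2 + sin(3x)); here Z = 2*sqrt(2*pi) by symmetry,
# which we use only to check the estimate afterwards.
def log_p_tilde(x):
    return -0.5 * x**2 + np.log(2.0 + np.sin(3.0 * x))

# Tractable proposal q = N(0, 2^2): easy to sample and to evaluate.
xs = rng.normal(0.0, 2.0, size=200_000)
log_q = -0.5 * (xs / 2.0) ** 2 - np.log(2.0 * np.sqrt(2.0 * np.pi))

logw = log_p_tilde(xs) - log_q
Z_hat = np.exp(logw).mean()      # Monte Carlo estimate of the normalising constant
w = np.exp(logw)
w /= w.sum()                     # self-normalisation: unknown constant cancels
mean_hat = (w * xs).sum()        # E[X] under the target, without knowing Z
```

The same self-normalised weights give expectations of any integrable function under the target, which is why the construction appears throughout approximate Bayesian inference.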
We propose two nonparametric statistical tests of goodness of fit for conditional distributions: given a conditional probability density function $p(y|x)$ and a joint sample, decide whether the sample is drawn from $p(y|x)r_x(x)$ for some density $r_x$.
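The problem can be made concrete with a much simpler diagnostic than the proposed tests: when the conditional model $p(y|x)$ is fully specified with a tractable CDF, the probability integral transform $F(y|x)$ is Uniform(0,1) whenever the joint sample comes from $p(y|x)r_x(x)$, for any covariate density $r_x$. The sketch below uses this PIT property with an assumed Gaussian conditional model; it is an illustration of the problem setting, not the kernel tests proposed in the abstract.

```python
from math import erf, sqrt
import numpy as np

rng = np.random.default_rng(0)
n = 2000

def norm_cdf(z):
    # Standard normal CDF via the error function
    return np.array([0.5 * (1.0 + erf(t / sqrt(2.0))) for t in z])

def ks_uniform(u):
    # Kolmogorov-Smirnov distance between a sample and Uniform(0, 1)
    u = np.sort(u)
    grid = np.arange(1, len(u) + 1) / len(u)
    return max(np.max(grid - u), np.max(u - grid + 1.0 / len(u)))

x = rng.exponential(size=n)           # covariates from an arbitrary density r_x
y_good = x + rng.normal(size=n)       # consistent with the model p(y|x) = N(x, 1)
y_bad = x + 1.5 * rng.normal(size=n)  # wrong conditional scale

# PIT values F(y|x) are uniform under H0, regardless of which r_x generated x.
d_good = ks_uniform(norm_cdf(y_good - x))
d_bad = ks_uniform(norm_cdf(y_bad - x))
```

Here `d_good` stays at the small deviation expected under uniformity while `d_bad` is clearly larger; the nonparametric tests in the abstract handle the general case where no such closed-form CDF argument is available.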
Models that employ latent variables to capture structure in observed data lie at the heart of many current unsupervised learning algorithms, but exact maximum-likelihood learning for powerful and flexible latent-variable models is almost always intractable.
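The intractability of maximum likelihood is usually sidestepped with the variational lower bound $\log p(x) \ge \mathbb{E}_{q(z)}[\log p(x,z) - \log q(z)]$. A minimal sketch in a toy Gaussian latent-variable model, chosen because its marginal likelihood happens to be tractable so the bound can be checked directly (the model, the observation, and the crude choice of $q$ are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
x = 1.3

# Toy model: z ~ N(0,1), x|z ~ N(z,1), so the marginal is p(x) = N(x; 0, 2).
log_px = -0.5 * np.log(2 * np.pi * 2.0) - x**2 / 4.0

# Monte Carlo ELBO with a deliberately crude q(z) = N(0, 1):
mu, s = 0.0, 1.0
z = rng.normal(mu, s, size=100_000)
log_prior = -0.5 * np.log(2 * np.pi) - z**2 / 2
log_lik = -0.5 * np.log(2 * np.pi) - (x - z) ** 2 / 2
log_qz = -0.5 * np.log(2 * np.pi * s**2) - (z - mu) ** 2 / (2 * s**2)
elbo = (log_prior + log_lik - log_qz).mean()
# elbo < log_px, and the gap equals KL(q(z) || p(z|x))
```

In realistic models `log_px` has no closed form, and learning proceeds by maximising the ELBO over both model and variational parameters.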
We propose a kernel-based nonparametric test of relative goodness of fit, where the goal is to compare two models, both of which may have unobserved latent variables, such that the marginal distribution of the observed variables is intractable.
Given two candidate models and a set of target observations, we address the problem of measuring the relative goodness of fit of the two models.
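When both models can at least be sampled from, a simple sample-based version of relative fit compares unbiased estimates of the squared maximum mean discrepancy between each model and the observations. This is only a sketch of the comparison, not the proposed test (which handles latent variables and assesses significance); the Gaussian kernel, bandwidth, and toy distributions are assumptions for illustration.

```python
import numpy as np

def mmd2(X, Y, sigma=1.0):
    # Unbiased estimate of squared MMD with a Gaussian RBF kernel
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma**2))
    Kxx, Kyy, Kxy = k(X, X), k(Y, Y), k(X, Y)
    n, m = len(X), len(Y)
    return ((Kxx.sum() - np.trace(Kxx)) / (n * (n - 1))
            + (Kyy.sum() - np.trace(Kyy)) / (m * (m - 1))
            - 2 * Kxy.mean())

rng = np.random.default_rng(1)
target = rng.normal(0.0, 1.0, size=(500, 1))    # observations
model_p = rng.normal(0.0, 1.0, size=(500, 1))   # well-specified model
model_q = rng.normal(1.0, 1.0, size=(500, 1))   # misspecified model

# Relative fit: the model with the smaller discrepancy to the data fits better.
p_closer = mmd2(model_p, target) < mmd2(model_q, target)
```

A proper relative test additionally asks whether the difference of the two discrepancies is statistically significant rather than comparing point estimates.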
Users' behavior in one service can reveal their preferences, and can be used to make recommendations in other services they have never used.
We investigate the statistical performance and computational efficiency of the alternating minimization procedure for nonparametric tensor learning.
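The mechanics of alternating minimization are easiest to see in a parametric special case: a rank-1 CP fit of a 3-way tensor, where fixing two factors makes the subproblem for the third a closed-form least-squares update. This is a minimal sketch of the procedure in that simplified setting, not the nonparametric estimator studied in the paper; the tensor sizes and the noiseless rank-1 ground truth are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noiseless rank-1 ground truth: T = a ⊗ b ⊗ c
a_true = rng.normal(size=5)
b_true = rng.normal(size=6)
c_true = rng.normal(size=7)
T = np.einsum("i,j,k->ijk", a_true, b_true, c_true)

# Alternating minimization: cycle through the factors, solving each
# least-squares subproblem exactly while the other two are held fixed.
a = rng.normal(size=5)
b = rng.normal(size=6)
c = rng.normal(size=7)
for _ in range(20):
    a = np.einsum("ijk,j,k->i", T, b, c) / ((b @ b) * (c @ c))
    b = np.einsum("ijk,i,k->j", T, a, c) / ((a @ a) * (c @ c))
    c = np.einsum("ijk,i,j->k", T, a, b) / ((a @ a) * (b @ b))

approx = np.einsum("i,j,k->ijk", a, b, c)
rel_err = np.linalg.norm(T - approx) / np.linalg.norm(T)
```

Each update is globally optimal for its own block, so the objective is monotonically non-increasing; the statistical and computational questions in the abstract concern how fast, and to what accuracy, such iterations converge in the nonparametric setting.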