Semi-discrete optimal transport problems, which evaluate the Wasserstein distance between a discrete and a generic (possibly non-discrete) probability measure, are believed to be computationally hard.
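For concreteness, writing the discrete measure as \(\nu = \sum_{i=1}^{N} \nu_i \,\delta_{y_i}\) (notation chosen here for illustration rather than taken from the text), one standard formulation of the semi-discrete problem and of its finite-dimensional Kantorovich dual, which is the form typically exploited numerically, is
\[
W_c(\mu,\nu) \;=\; \min_{\pi \in \Pi(\mu,\nu)} \int_{\mathcal{X}\times\mathcal{Y}} c(x,y)\,\mathrm{d}\pi(x,y)
\;=\; \max_{\phi \in \mathbb{R}^{N}} \;\sum_{i=1}^{N} \phi_i \,\nu_i \;+\; \int_{\mathcal{X}} \min_{1 \le i \le N}\bigl( c(x,y_i) - \phi_i \bigr)\,\mathrm{d}\mu(x),
\]
where \(\mu\) is the generic (possibly non-discrete) measure, \(c\) is the transportation cost, and \(\Pi(\mu,\nu)\) denotes the set of couplings of \(\mu\) and \(\nu\).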
The proposed model can be viewed as a zero-sum game between a statistician choosing an estimator (a measurable function of the observation) and a fictitious adversary choosing a prior (a pair of signal and noise distributions ranging over independent Wasserstein balls), with the statistician seeking to minimize and the adversary to maximize the expected squared estimation error.
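In symbols, and under the illustrative assumption of an additive observation model \(y = x + w\) with signal \(x\) and noise \(w\) independent (the notation below is not taken from the text), the game reads
\[
\inf_{\psi}\;\sup_{\mathbb{P}_x \in \mathbb{B}_{\rho_x}(\hat{\mathbb{P}}_x),\;\mathbb{P}_w \in \mathbb{B}_{\rho_w}(\hat{\mathbb{P}}_w)}\;
\mathbb{E}_{x \sim \mathbb{P}_x,\, w \sim \mathbb{P}_w}\bigl[\,\|x - \psi(x+w)\|^2\,\bigr],
\]
where \(\psi\) ranges over all measurable estimators of \(x\) from the observation \(y\), and \(\mathbb{B}_{\rho}(\hat{\mathbb{P}})\) denotes a Wasserstein ball of radius \(\rho\) centered at a nominal signal or noise distribution \(\hat{\mathbb{P}}\).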
The likelihood function is a fundamental component in Bayesian statistics.
A fundamental problem arising in many areas of machine learning is evaluating the likelihood of a given observation under different nominal distributions.
The goal of data-driven decision-making is to learn, from finitely many training samples, a decision that will perform well on unseen test samples.
Despite the non-convex nature of the ambiguity set, we prove that the estimation problem is equivalent to a tractable convex program.