Sample Complexity of Sample Average Approximation for Conditional Stochastic Optimization

28 May 2019 · Yifan Hu, Xin Chen, Niao He

In this paper, we study a class of stochastic optimization problems, referred to as \emph{Conditional Stochastic Optimization} (CSO), of the form $\min_{x \in \mathcal{X}} \mathbb{E}_{\xi}\, f_\xi\big(\mathbb{E}_{\eta|\xi}[g_\eta(x,\xi)]\big)$, which finds a wide spectrum of applications, including portfolio selection, reinforcement learning, robust learning, and causal inference. Assuming the availability of samples from the distribution $\mathbb{P}(\xi)$ and samples from the conditional distribution $\mathbb{P}(\eta|\xi)$, we establish the sample complexity of the sample average approximation (SAA) for CSO under a variety of structural assumptions, such as Lipschitz continuity, smoothness, and error bound conditions. We show that the total sample complexity improves from $\mathcal{O}(d/\epsilon^4)$ to $\mathcal{O}(d/\epsilon^3)$ when assuming smoothness of the outer function, and further to $\mathcal{O}(1/\epsilon^2)$ when the empirical function satisfies the quadratic growth condition. We also establish the sample complexity of a modified SAA when $\xi$ and $\eta$ are independent. Several numerical experiments support our theoretical findings.

Keywords: stochastic optimization, sample average approximation, large deviations theory
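The SAA for CSO replaces both expectations by empirical means: given outer samples $\{\xi_i\}_{i=1}^n$ and, for each $\xi_i$, conditional samples $\{\eta_{ij}\}_{j=1}^m$, one solves $\min_{x \in \mathcal{X}} \frac{1}{n}\sum_{i=1}^{n} f_{\xi_i}\big(\frac{1}{m}\sum_{j=1}^{m} g_{\eta_{ij}}(x,\xi_i)\big)$. Below is a minimal sketch of this construction on a hypothetical toy instance; the quadratic outer function, the Gaussian sampling, and the names `saa_objective`, `xi`, `eta` are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Toy CSO instance (illustrative, not from the paper):
#   f_xi(y)      = ||y||^2                      (smooth outer function)
#   g_eta(x, xi) = x - xi + eta,  eta | xi ~ N(0, I)
# For this instance the true CSO minimizer is x* = E[xi] = 0.
rng = np.random.default_rng(0)
d, n, m = 5, 200, 50               # dimension, outer samples, inner samples

xi = rng.normal(size=(n, d))       # xi_1, ..., xi_n ~ P(xi)
eta = rng.normal(size=(n, m, d))   # eta_{i1}, ..., eta_{im} ~ P(eta | xi_i)

def saa_objective(x):
    # Inner empirical mean (1/m) sum_j g_{eta_ij}(x, xi_i), one per xi_i
    inner = (x - xi[:, None, :] + eta).mean(axis=1)        # shape (n, d)
    # Outer empirical mean (1/n) sum_i f_{xi_i}(inner_i)
    return (inner ** 2).sum(axis=1).mean()

res = minimize(saa_objective, x0=np.zeros(d), method="L-BFGS-B")
print("SAA solution:", res.x)      # concentrates near 0 as n, m grow
```

Note that because the inner empirical mean sits inside the (nonlinear) outer function, the SAA objective is a biased estimator of the true objective for any finite $m$, which is why the complexity results above are stated in terms of the total number of samples across both levels.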
