# Oracle lower bounds for stochastic gradient sampling algorithms

1 Feb 2020

We consider the problem of sampling from a strongly log-concave density in $\mathbb{R}^d$, and prove an information-theoretic lower bound on the number of stochastic gradient queries of the log density needed. Several popular sampling algorithms (including many Markov chain Monte Carlo methods) operate by using stochastic gradients of the log density to generate a sample; our results establish an information-theoretic limit for all these algorithms...
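To make the query model concrete, here is a minimal sketch of one algorithm in the class the abstract describes: stochastic gradient Langevin dynamics (SGLD). All specifics are assumptions for illustration, not taken from the paper: the target is a standard Gaussian (so $\nabla \log p(x) = -x$, a strongly log-concave case), and the stochastic gradient oracle returns that gradient corrupted by Gaussian noise.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_grad_log_p(x, noise_std=0.5):
    """Stochastic gradient oracle for log p with p = N(0, 1).

    Hypothetical oracle for illustration: true gradient -x plus
    Gaussian query noise, modeling a single stochastic gradient query.
    """
    return -x + noise_std * rng.standard_normal(x.shape)

def sgld(n_steps=20000, step_size=0.05, d=1):
    """Run SGLD: gradient step plus injected Langevin noise."""
    x = np.zeros(d)
    samples = np.empty((n_steps, d))
    for k in range(n_steps):
        x = (x + step_size * noisy_grad_log_p(x)
             + np.sqrt(2 * step_size) * rng.standard_normal(d))
        samples[k] = x
    return samples

samples = sgld()
# Discard burn-in; empirical moments should be close to those of N(0, 1)
# (the gradient noise and finite step size bias the variance slightly).
tail = samples[5000:]
print(tail.mean(), tail.var())
```

Each loop iteration spends exactly one oracle call, so the number of iterations equals the query count that the paper's lower bound constrains.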

