# The promises and pitfalls of Stochastic Gradient Langevin Dynamics

Nicolas Brosse, Alain Durmus, Eric Moulines

Stochastic Gradient Langevin Dynamics (SGLD) has emerged as a key MCMC algorithm for Bayesian learning from large scale datasets. While SGLD with decreasing step sizes converges weakly to the posterior distribution, the algorithm is often used with a constant step size in practice and has demonstrated successes in machine learning tasks.
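To make the constant-step-size setting concrete, the following is a minimal sketch of SGLD on a toy Gaussian model, not the paper's experimental setup: each iteration takes a gradient step on a minibatch estimate of the negative log-posterior and injects Gaussian noise scaled by the step size. All names, step sizes, and the model here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: y_i ~ N(theta, 1) with a flat prior, so the posterior over
# theta is N(mean(y), 1/n). We sample it with constant-step-size SGLD.
n = 1000
theta_true = 2.0
y = theta_true + rng.standard_normal(n)

def grad_U_minibatch(theta, batch):
    # Unbiased minibatch estimate of the gradient of the negative
    # log-posterior, rescaled by n / |batch|.
    return (n / len(batch)) * np.sum(theta - batch)

gamma = 5e-4          # constant step size (illustrative choice)
batch_size = 32
theta = 0.0
samples = []
for t in range(5000):
    batch = y[rng.integers(0, n, size=batch_size)]
    noise = np.sqrt(2.0 * gamma) * rng.standard_normal()
    # SGLD update: gradient step plus injected Gaussian noise.
    theta = theta - gamma * grad_U_minibatch(theta, batch) + noise
    if t >= 1000:     # discard burn-in
        samples.append(theta)

print(np.mean(samples))   # close to the posterior mean, y.mean()
```

With a constant step size the chain does not target the posterior exactly; it has an asymptotic bias that grows with the step size, which is precisely the regime the abstract refers to.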
