# Stochastic gradient descent performs variational inference, converges to limit cycles for deep networks

Pratik Chaudhari, Stefano Soatto

Stochastic gradient descent (SGD) is widely believed to perform implicit regularization when used to train deep neural networks, but the precise manner in which this occurs has thus far been elusive. We prove that SGD minimizes an average potential over the posterior distribution of weights, along with an entropic regularization term...
