Fast gradient descent for drifting least squares regression, with application to bandits

Online learning algorithms often require recomputing least squares regression estimates of parameters. We study improving the computational complexity of such algorithms by using stochastic gradient descent (SGD)-type schemes in place of classic regression solvers...
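To make the trade-off concrete, here is a minimal sketch (not the paper's exact algorithm) of the idea the abstract describes: instead of recomputing an exact least-squares solve each round, a single SGD step is taken on the newest sample, tracking a slowly drifting parameter. All names, step sizes, and the drift model below are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: one SGD step per incoming sample in place of an
# exact least-squares recomputation, for a drifting linear target.

rng = np.random.default_rng(0)
d = 5
theta_hat = np.zeros(d)   # running SGD estimate of the parameter
eta = 0.05                # step size (assumed constant for simplicity)

theta_true = rng.normal(size=d)
for t in range(1, 1001):
    theta_true += 0.001 * rng.normal(size=d)   # slow drift in the target
    x = rng.normal(size=d)                     # observed feature vector
    y = x @ theta_true + 0.1 * rng.normal()    # noisy linear response

    # A classic solver would recompute argmin_theta sum_s (x_s^T theta - y_s)^2,
    # costing at least O(d^2) per round via recursive least squares.
    # The SGD alternative is a single O(d) gradient step on the new sample:
    grad = (x @ theta_hat - y) * x
    theta_hat -= eta * grad

print("final estimation error:", np.linalg.norm(theta_hat - theta_true))
```

The per-round cost drops from quadratic (or worse) in the dimension to linear, which is the computational saving the abstract points to for online learning and bandit loops.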
