Block Mean Approximation for Efficient Second Order Optimization

Advanced optimization algorithms such as Newton's method and AdaGrad benefit from second-order derivatives or second-order statistics to achieve better descent directions and faster convergence rates. At their heart, such algorithms need to compute the inverse or inverse square root of a matrix whose size is quadratic in the dimensionality of the search space...
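Only the abstract is shown here, so the following is a minimal sketch of the general idea rather than the paper's exact construction: a PSD second-order statistics matrix (as accumulated by full-matrix AdaGrad) is approximated by replacing each block with its mean, and the resulting structured matrix can be inverted cheaply. The damping term `eps`, the equal block sizes, and the Woodbury-identity route to the inverse are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n, k = 12, 3                       # problem dimension, number of blocks
sizes = np.array([4, 4, 4])        # block sizes (must sum to n)

# A PSD second-order statistics matrix, e.g. accumulated g g^T as in
# full-matrix AdaGrad; inverting it directly costs O(n^3).
A = rng.standard_normal((n, 2 * n))
G = A @ A.T / (2 * n)

# Indicator matrix S: S[i, g] = 1 iff coordinate i belongs to block g.
groups = np.repeat(np.arange(k), sizes)
S = np.eye(k)[groups]              # shape (n, k)

# M[g, h] = mean of G's entries inside block (g, h).
M = (S.T @ G @ S) / np.outer(sizes, sizes)

# Block mean approximation: every entry of block (g, h) replaced by M[g, h].
B = S @ M @ S.T

# Damped preconditioner eps*I + B. By the Woodbury identity,
#   (eps*I + S M S^T)^{-1} = I/eps - S (eps*M^{-1} + diag(sizes))^{-1} S^T / eps,
# which needs only a k x k solve instead of an n x n inverse.
eps = 1e-2
inner = eps * np.linalg.inv(M) + np.diag(sizes.astype(float))
B_inv = np.eye(n) / eps - S @ np.linalg.solve(inner, S.T) / eps

# Sanity check against the direct O(n^3) inverse.
assert np.allclose(B_inv, np.linalg.inv(eps * np.eye(n) + B))
print("max BMA error vs. G:", np.abs(B - G).max())
```

The cost of the structured inverse is O(nk + k^3) rather than O(n^3), which is the kind of saving the abstract alludes to when the matrix size is quadratic in the search-space dimensionality.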
