A Geometric Understanding of Natural Gradient

13 Feb 2022 · Qinxun Bai, Steven Rosenberg, Wei Xu

While natural gradients have been widely studied from both theoretical and empirical perspectives, we argue that some fundamental theoretical issues regarding the existence of gradients in infinite-dimensional function spaces remain underexplored. We address these issues by providing a geometric perspective and mathematical framework for studying the natural gradient that is more complete and rigorous than existing studies. Our results also establish new connections between natural gradients and RKHS theory, and specifically to the Neural Tangent Kernel (NTK). Based on our theoretical framework, we derive a new family of natural gradients induced by Sobolev metrics and develop computational techniques for efficient approximation in practice. Preliminary experimental results reveal the potential of this new natural gradient variant.
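
As background for the abstract, the sketch below illustrates the classical (Fisher-metric) natural gradient update that the paper generalizes: the parameter update is preconditioned by the inverse Fisher information matrix, theta <- theta - lr * F(theta)^{-1} * grad L(theta). This is only the standard Amari-style natural gradient on a toy linear-Gaussian model, not the Sobolev-metric variant proposed in the paper; the model, function name, and hyperparameters are illustrative assumptions.

import numpy as np

def natural_gradient_step(theta, X, y, lr=0.5, damping=1e-3):
    """One natural gradient step for a linear model y ~ N(X @ theta, 1).

    For this Gaussian likelihood, the Fisher information matrix is exactly
    X.T @ X / n, so preconditioning the gradient by its (damped) inverse
    gives the classical natural gradient update.
    """
    n = len(y)
    residual = X @ theta - y
    grad = X.T @ residual / n                          # gradient of the mean negative log-likelihood
    fisher = X.T @ X / n + damping * np.eye(len(theta))  # Fisher matrix, damped for a stable inverse
    return theta - lr * np.linalg.solve(fisher, grad)

# Toy usage: recover a 3-dimensional parameter from noisy linear observations.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_theta = np.array([1.0, -2.0, 0.5])
y = X @ true_theta + 0.1 * rng.normal(size=200)

theta = np.zeros(3)
for _ in range(50):
    theta = natural_gradient_step(theta, X, y)
print(theta)  # should be close to true_theta

In larger models the Fisher matrix cannot be formed or inverted exactly, which is where approximation schemes such as those developed in the paper come in.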
