Derivation of Coupled PCA and SVD Learning Rules from a Newton Zero-Finding Framework

25 Mar 2020 · Ralf Möller

In coupled learning rules for PCA (principal component analysis) and SVD (singular value decomposition), the update of the estimates of eigenvectors or singular vectors is influenced by the estimates of eigenvalues or singular values, respectively. This coupled update mitigates the speed-stability problem since the update equations converge from all directions with approximately the same speed. A method to derive coupled learning rules from information criteria by Newton optimization is known. However, these information criteria have to be designed, offer no explanatory value, and can only impose Euclidean constraints on the vector estimates. Here we describe an alternative approach where coupled PCA and SVD learning rules can systematically be derived from a Newton zero-finding framework. The derivation starts from an objective function, combines the equations for its extrema with arbitrary constraints on the vector estimates, and solves the resulting vector zero-point equation using Newton's zero-finding method. To demonstrate the framework, we derive PCA and SVD learning rules with constant Euclidean length or constant sum of the vector estimates.
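As a concrete illustration of the framework sketched in the abstract (not the paper's derived learning rules), the following Python snippet stacks the stationarity condition of a Rayleigh-type objective, C w − λ w = 0, together with a Euclidean-length constraint, (1 − wᵀw)/2 = 0, into a single zero-point equation and applies Newton's zero-finding method to the combined system, so that the vector and eigenvalue estimates are updated jointly. The function name `newton_eigenpair`, the iteration count, and the perturbed starting point are illustrative assumptions; the paper derives online learning rules from such a Newton step rather than using the exact batch iteration shown here.

```python
import numpy as np

def newton_eigenpair(C, w0, lam0, iters=20):
    """Newton zero-finding for one eigenpair of a symmetric matrix C.

    Stationarity of a Rayleigh-type objective gives C w - lam * w = 0; a
    Euclidean-length constraint contributes (1 - w^T w) / 2 = 0. Stacking
    both into one vector equation f(w, lam) = 0 and applying Newton's
    method couples the updates of the vector and eigenvalue estimates.
    """
    n = C.shape[0]
    w, lam = np.asarray(w0, dtype=float).copy(), float(lam0)
    for _ in range(iters):
        f = np.concatenate([C @ w - lam * w, [(1.0 - w @ w) / 2.0]])
        J = np.zeros((n + 1, n + 1))
        J[:n, :n] = C - lam * np.eye(n)  # d(Cw - lam*w)/dw
        J[:n, n] = -w                    # d(Cw - lam*w)/dlam
        J[n, :n] = -w                    # d((1 - w^T w)/2)/dw
        step = np.linalg.solve(J, f)     # Newton step: z <- z - J^{-1} f(z)
        w -= step[:n]
        lam -= step[n]
    return w, lam

# Usage: refine a perturbed eigenpair of a random symmetric matrix.
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
C = (A + A.T) / 2.0
evals, evecs = np.linalg.eigh(C)
w0 = evecs[:, -1] + 0.1 * rng.normal(size=4)   # perturbed starting vector
w, lam = newton_eigenpair(C, w0, evals[-1] + 0.1)
print(lam, np.linalg.norm(C @ w - lam * w))    # residual ~ 0, |w| ~ 1
```

The bordered Jacobian in this sketch is nonsingular at a simple eigenvalue, which is what makes the coupled Newton update well conditioned in all directions; swapping the length constraint for a different constraint (e.g., a constant sum of the vector entries) only changes the last row of f and J.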
