Grassmann Iterative Linear Discriminant Analysis with Proxy Matrix Optimization

16 Apr 2021 · Navya Nagananda, Breton Minnehan, Andreas Savakis

Linear Discriminant Analysis (LDA) is commonly used for dimensionality reduction in pattern recognition and statistics. It is a supervised method that seeks the most discriminative reduced-dimension subspace, which can then be used for classification. In this work, we present a Grassmann Iterative LDA method (GILDA) based on Proxy Matrix Optimization (PMO). PMO uses automatic differentiation and stochastic gradient descent (SGD) on the Grassmann manifold to arrive at the optimal projection matrix. Our results show that GILDA outperforms the prevailing manifold optimization method.
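To illustrate the general idea, the following is a minimal, hypothetical PyTorch sketch of proxy-matrix-style optimization: an unconstrained proxy matrix is updated by SGD with automatic differentiation on a trace-ratio LDA objective and then retracted onto the Grassmann manifold with a QR step. The function names (lda_scatter, gilda_sketch), the trace-ratio loss, and the QR retraction are assumptions made for illustration only, not the authors' implementation.

import torch

def lda_scatter(X, y):
    # Between-class (Sb) and within-class (Sw) scatter matrices.
    d = X.shape[1]
    mean_all = X.mean(dim=0)
    Sb = torch.zeros(d, d)
    Sw = torch.zeros(d, d)
    for c in torch.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(dim=0)
        diff = (mc - mean_all).unsqueeze(1)
        Sb = Sb + Xc.shape[0] * diff @ diff.T
        Sw = Sw + (Xc - mc).T @ (Xc - mc)
    return Sb, Sw

def gilda_sketch(X, y, k, steps=200, lr=1e-2):
    # Hypothetical proxy-matrix loop: SGD updates an unconstrained proxy,
    # which is then retracted onto the Grassmann manifold via QR.
    d = X.shape[1]
    Sb, Sw = lda_scatter(X, y)
    P = torch.randn(d, k, requires_grad=True)      # proxy matrix
    opt = torch.optim.SGD([P], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        Q, _ = torch.linalg.qr(P)                  # orthonormal point on the manifold
        # Negative trace-ratio objective: maximize between-class vs. within-class scatter.
        loss = -torch.trace(Q.T @ Sb @ Q) / torch.trace(Q.T @ Sw @ Q)
        loss.backward()                            # automatic differentiation
        opt.step()                                 # SGD step on the proxy
        with torch.no_grad():
            P.copy_(torch.linalg.qr(P)[0])         # retraction: restore orthonormal columns
    return P.detach()

# Example usage: project 20-dimensional, 3-class data onto a 2-D discriminant subspace.
X = torch.randn(300, 20)
y = torch.randint(0, 3, (300,))
W = gilda_sketch(X, y, k=2)
X_proj = X @ W

The sketch treats the proxy matrix as an ordinary tensor for the SGD update and enforces the manifold constraint only through the retraction; the paper's actual PMO procedure may differ in the objective, the retraction, and the update rule.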
