Extension of PCA to Higher Order Data Structures: An Introduction to Tensors, Tensor Decompositions, and Tensor PCA

26 Jun 2018  ·  Ali Zare, Alp Ozdemir, Mark A. Iwen, Selin Aviyente

The widespread use of multisensor technology and the emergence of big data sets have created the need for more versatile tools to represent higher-order data with multiple aspects and high dimensionality. Data in the form of multidimensional arrays, also referred to as tensors, arise in a variety of applications including chemometrics, hyperspectral imaging, high-resolution video, neuroimaging, biometrics, and social network analysis. Early multiway data analysis approaches reformatted such tensor data as large vectors or matrices and then resorted to dimensionality reduction methods developed for classical two-way analysis, such as PCA. However, conventional PCA cannot discover the hidden components within multiway data. To this end, tensor decomposition methods, which are flexible in the choice of constraints and which extract more general latent components, have been proposed. In this paper, we review the major tensor decomposition methods with a focus on problems targeted by classical PCA. In particular, we present tensor methods that aim to solve three important challenges typically addressed by PCA: dimensionality reduction, i.e., low-rank tensor approximation; supervised learning, i.e., learning linear subspaces for feature extraction; and robust low-rank tensor recovery. We also provide experimental results comparing different tensor models on both dimensionality reduction and supervised learning applications.
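To make the contrast with matricized PCA concrete, the following is a minimal NumPy sketch of one standard low-rank tensor approximation, the truncated higher-order SVD (HOSVD), which generalizes PCA's truncated SVD to each mode of a tensor. This is an illustrative example only, not the paper's implementation; the function names (`unfold`, `fold`, `hosvd_approx`) are hypothetical helpers introduced here for clarity.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: arrange the mode-`mode` fibers of T as matrix columns."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of unfold: rebuild a tensor of the given shape from its mode-n unfolding."""
    full = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(full), 0, mode)

def hosvd_approx(T, ranks):
    """Truncated HOSVD (illustrative sketch): project each mode of T onto the
    leading singular vectors of its unfolding, then reconstruct."""
    # Factor matrices: leading left singular vectors of each mode-n unfolding,
    # exactly as PCA takes the leading singular vectors of a data matrix.
    factors = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
               for m, r in enumerate(ranks)]
    # Core tensor: multiply T by U_n^T along every mode.
    core = T
    for mode, U in enumerate(factors):
        new_shape = core.shape[:mode] + (U.shape[1],) + core.shape[mode + 1:]
        core = fold(U.T @ unfold(core, mode), mode, new_shape)
    # Low-rank reconstruction: multiply the core back by each U_n.
    approx = core
    for mode, U in enumerate(factors):
        new_shape = approx.shape[:mode] + (U.shape[0],) + approx.shape[mode + 1:]
        approx = fold(U @ unfold(approx, mode), mode, new_shape)
    return approx
```

If the input tensor has exact multilinear rank matching `ranks`, the reconstruction is exact; for general tensors it yields a quasi-optimal low-rank approximation, unlike first flattening the tensor and applying PCA, which discards the multiway structure.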
