A survey of dimensionality reduction techniques based on random projection

14 Jun 2017  ·  Haozhe Xie, Jie Li, Hanqing Xue

Dimensionality reduction techniques play an important role in the analysis of big data. Traditional approaches, such as principal component analysis (PCA) and linear discriminant analysis (LDA), have been studied extensively over the past few decades. However, as the dimensionality of data increases, the computational cost of these traditional methods grows exponentially and the computation becomes intractable. These drawbacks have motivated the development of random projection (RP) techniques, which map high-dimensional data onto a low-dimensional subspace at greatly reduced time cost. However, because the RP transformation matrix is generated without considering the intrinsic structure of the original data, it usually introduces relatively high distortion. In recent years, a number of RP-based methods have therefore been proposed to address this problem. In this paper, we summarize the methods suited to different situations to help practitioners choose the proper techniques for their specific applications. We also enumerate the benefits and limitations of the various methods and provide further references for researchers seeking to develop novel RP-based approaches.

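As a concrete illustration of the basic RP idea described in the abstract, the sketch below projects data onto a k-dimensional subspace using a Gaussian random matrix whose entries are drawn i.i.d. from N(0, 1/k). This is a minimal example of plain Gaussian random projection only, not any of the structured or data-aware variants surveyed in the paper; the function name and dimensions are illustrative assumptions.

```python
import numpy as np

def gaussian_random_projection(X, k, seed=None):
    """Project the rows of X (n x d) onto a k-dimensional subspace
    using a dense Gaussian random matrix (Johnson-Lindenstrauss style)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Entries ~ N(0, 1/k): with this scaling, pairwise Euclidean
    # distances are preserved in expectation.
    R = rng.normal(loc=0.0, scale=1.0 / np.sqrt(k), size=(d, k))
    return X @ R

# Example: reduce 10,000-dimensional data to 300 dimensions.
X = np.random.default_rng(0).standard_normal((500, 10_000))
X_low = gaussian_random_projection(X, k=300, seed=0)
print(X_low.shape)  # (500, 300)
```

The 1/sqrt(k) scaling is the standard choice that keeps pairwise distances approximately unchanged after projection; since the matrix R is generated independently of the data, the distortion the paper discusses depends only on the target dimension k, not on the structure of X.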