Generalized Nesterov's Acceleration-incorporated Non-negative and Adaptive Latent Factor Analysis

A non-negative latent factor (NLF) model trained with a single latent factor-dependent, non-negative and multiplicative update (SLF-NMU) algorithm is frequently adopted to extract useful knowledge from non-negative data represented by high-dimensional and sparse (HiDS) matrices arising in various service applications. However, its convergence rate is rather slow. To address this issue, this study proposes a Generalized Nesterov's acceleration-incorporated, Non-negative and Adaptive Latent Factor (GNALF) model. It results from a) incorporating a generalized Nesterov's accelerated gradient (NAG) method into an SLF-NMU algorithm, thereby achieving an NAG-incorporated and element-oriented non-negative (NEN) algorithm that performs efficient parameter updates; and b) making its regularization and acceleration hyper-parameters self-adaptive by incorporating the principle of a particle swarm optimization algorithm into the training process, thereby implementing a highly adaptive and practical model. Empirical studies on six large sparse matrices from different recommendation service applications show that a GNALF model achieves a very high convergence rate without the need for hyper-parameter tuning, making its computational efficiency significantly higher than that of state-of-the-art models. Meanwhile, such an efficiency gain does not come at the cost of accuracy, since its prediction accuracy is comparable with that of its peers. Hence, it can better serve practical service applications with real-time demands.
