PROMPT: Parallel Iterative Algorithm for $\ell_{p}$ norm linear regression via Majorization Minimization with an application to semi-supervised graph learning

23 Oct 2021 · R. Jyothi, P. Babu

In this paper, we consider the problem of $\ell_{p}$ norm linear regression, which has several applications, including sparse recovery, data clustering, and semi-supervised learning. Although the problem is convex, it does not admit a closed-form solution. The state-of-the-art algorithms are iterative but suffer from convergence issues: they either diverge for $p>3$ or their convergence to the optimal solution is sensitive to the initialization of the algorithm. Moreover, these algorithms do not generalize to every possible value of $p$. In this paper, we propose an iterative algorithm, Parallel IteRative AlgOrithM for $\ell_{P}$ norm regression via MajorizaTion Minimization (PROMPT), based on the principle of Majorization Minimization, and prove that the proposed algorithm is monotonic and converges to the optimal solution of the problem for any value of $p$. The proposed algorithm can also update each element of the regression variable in parallel, which helps it handle large-scale data efficiently, a common scenario in this era of data explosion. Subsequently, we show that the proposed algorithm can also be applied to the graph-based semi-supervised learning problem. We show through numerical simulations that the proposed algorithm converges to the optimal solution from any random initialization and converges faster than the state-of-the-art algorithms. We also evaluate the performance of the proposed algorithm on simulated and real data for the graph-based semi-supervised learning problem.
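As a reference point for the objective being minimized, the sketch below implements the classical iteratively reweighted least squares (IRLS) baseline for $\ell_{p}$ regression, $\min_{\mathbf{x}} \|\mathbf{A}\mathbf{x}-\mathbf{b}\|_{p}^{p}$, which can itself be read as a Majorization Minimization scheme with a quadratic majorizer when $1 \le p \le 2$. This is not the PROMPT algorithm proposed in the paper; the function name `irls_lp` and all parameter choices are illustrative assumptions.

```python
import numpy as np

def irls_lp(A, b, p=1.5, n_iter=100, eps=1e-8):
    """Baseline IRLS sketch for min_x ||A x - b||_p^p (illustrative, not PROMPT).

    For 1 <= p <= 2 each reweighted least-squares step minimizes a quadratic
    majorizer of |r|^p at the current residual, so the objective is monotonically
    non-increasing; eps guards against zero residuals.
    """
    # Least-squares initialization (the p = 2 solution).
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(n_iter):
        r = A @ x - b
        # Majorizer weights |r_i|^(p-2), clipped away from zero residuals.
        w = np.maximum(np.abs(r), eps) ** (p - 2)
        # Weighted least-squares step: minimize sum_i w_i (a_i^T x - b_i)^2.
        Aw = A * w[:, None]
        x = np.linalg.solve(A.T @ Aw, Aw.T @ b)
    return x

# Small synthetic example.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 10))
x_true = rng.standard_normal(10)
b = A @ x_true + 0.1 * rng.standard_normal(200)
x_hat = irls_lp(A, b, p=1.3)
print(np.linalg.norm(x_hat - x_true))
```

Outside the $1 \le p \le 2$ range the quadratic no longer majorizes the objective and plain reweighting can diverge (the issue the abstract notes for $p>3$), which is usually handled with damping or line search; a monotone, parallel update valid for any $p$ is precisely the contribution claimed for PROMPT.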
