On the convergence of the IRLS algorithm in Non-Local Patch Regression

2 Mar 2013 · Kunal N. Chaudhury

Recently, it was demonstrated in [CS2012, CS2013] that the robustness of the classical Non-Local Means (NLM) algorithm [BCM2005] can be improved by incorporating $\ell^p$ ($0 < p \leq 2$) regression into the NLM framework. This general optimization framework, called Non-Local Patch Regression (NLPR), contains NLM as a special case. Denoising results on synthetic and natural images show that NLPR consistently outperforms NLM beyond a moderate noise level, and significantly so when $p$ is close to zero. An iteratively reweighted least-squares (IRLS) algorithm was proposed for solving the regression problem in NLPR, where the NLM output was used to initialize the iterations. Based on exhaustive numerical experiments, we observe that the IRLS algorithm is globally convergent (for arbitrary initialization) in the convex regime $1 \leq p \leq 2$, and locally convergent in the non-convex regime $0 < p < 1$, where it rarely fails to converge when initialized with the NLM output. In this letter, we adapt the "majorize-minimize" framework introduced in [Voss1980] to explain these observations.

References

[CS2012] Chaudhury et al. (2012), "Non-local Euclidean medians," IEEE Signal Processing Letters.
[CS2013] Chaudhury et al. (2013), "Non-local patch regression: Robust image denoising in patch space," IEEE ICASSP.
[BCM2005] Buades et al. (2005), "A review of image denoising algorithms, with a new one," Multiscale Modeling and Simulation.
[Voss1980] Voss et al. (1980), "Linear convergence of generalized Weiszfeld's method," Computing.
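For concreteness, the sketch below shows one possible form of the IRLS iteration for the $\ell^p$ patch regression described above, assuming the cost $\sum_i w_i \|x - P_i\|_2^p$ over the patches $P_i$ in a search window, with NLM-style weights $w_i$ and NLM initialization. The function name, the fixed iteration count, and the small constant `eps` that keeps the reweighting finite are illustrative choices, not the paper's implementation.

```python
import numpy as np

def nlpr_irls(patches, weights, p, num_iters=20, eps=1e-8):
    """IRLS sketch for  min_x  sum_i w_i * ||x - P_i||^p,  0 < p <= 2.

    patches : (n, d) array, each row a vectorized noisy patch P_i
              from the search window around the patch being denoised
    weights : (n,) NLM-style similarity weights w_i
    p       : regression exponent
    eps     : small constant keeping the reweighting finite (illustrative)
    """
    w = weights / weights.sum()
    # Initialize with the NLM output, i.e. the p = 2 minimizer.
    x = w @ patches
    for _ in range(num_iters):
        # Distance of the current estimate to every patch in the window.
        d = np.linalg.norm(x - patches, axis=1)
        # Reweighting step: u_i = w_i * ||x - P_i||^(p - 2).
        u = w * (d + eps) ** (p - 2)
        # Minimization step: weighted average of the patches.
        x = (u @ patches) / u.sum()
    return x
```

In a full denoiser this iteration would be run at every pixel location and, for instance, the center coefficient of the regressed patch $x$ kept as the denoised value.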
