Convergence Guarantees for Non-Convex Optimisation with Cauchy-Based Penalties

10 Mar 2020 · Oktay Karakus, Perla Mayo, Alin Achim

In this paper, we propose a proximal splitting methodology with a non-convex penalty function based on the heavy-tailed Cauchy distribution. We first derive a closed-form expression for the proximal operator of the Cauchy prior, which makes it applicable in generic proximal splitting algorithms. We further derive the condition required to guarantee convergence to a solution of optimisation problems involving the Cauchy-based penalty function. Setting the system parameters so that they satisfy the proposed condition ensures convergence even though the overall cost function is non-convex, when minimisation is performed via a proximal splitting algorithm. The proposed Cauchy regularisation method is evaluated on generic signal processing examples, namely 1D signal denoising in the frequency domain, two image reconstruction tasks (de-blurring and denoising), and error recovery in a multiple-antenna communication system. We experimentally verify the proposed convergence conditions for various cases, and show the effectiveness of the proposed Cauchy-based non-convex penalty function over state-of-the-art penalty functions such as the $L_1$ and total variation ($TV$) norms.
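To make the approach concrete, below is a minimal sketch (not the authors' released code) of forward-backward proximal splitting with a Cauchy-based penalty for 1D denoising. It assumes the penalty $p(x) = \log(\gamma^2 + x^2)$, i.e. the negative log of the Cauchy prior up to additive constants; setting the derivative of the prox objective to zero then gives the cubic $x^3 - zx^2 + (\gamma^2 + 2\mu)x - z\gamma^2 = 0$, which the sketch solves numerically with `numpy.roots` rather than via the paper's closed-form expression. The default scale $\gamma = \sqrt{\mu}/2$ is an illustrative placeholder motivated by a convergence condition of this assumed form; the exact condition is the one derived in the paper.

```python
import numpy as np


def prox_cauchy(z, mu, gamma):
    """Element-wise proximal operator of the Cauchy penalty
    p(x) = log(gamma**2 + x**2) (Cauchy negative log-prior, up to constants):
        argmin_x (x - z)**2 / (2*mu) + p(x).
    Stationarity gives the cubic x^3 - z*x^2 + (gamma^2 + 2*mu)*x - z*gamma^2 = 0;
    it is solved numerically and the real root with the lowest objective kept."""
    z = np.atleast_1d(np.asarray(z, dtype=float))
    out = np.empty(z.shape)
    for i, zi in np.ndenumerate(z):
        roots = np.roots([1.0, -zi, gamma ** 2 + 2.0 * mu, -zi * gamma ** 2])
        real = roots[np.abs(roots.imag) < 1e-8].real
        if real.size == 0:  # numerical safety: fall back to least-imaginary root
            real = np.array([roots[np.argmin(np.abs(roots.imag))].real])
        obj = (real - zi) ** 2 / (2.0 * mu) + np.log(gamma ** 2 + real ** 2)
        out[i] = real[np.argmin(obj)]
    return out


def fb_denoise(y, mu=0.8, gamma=None, n_iter=200):
    """Forward-backward splitting for min_x 0.5*||y - x||^2 + Cauchy penalty.
    The default gamma = sqrt(mu)/2 is an assumed placeholder tied to a
    convergence condition of the form gamma >= sqrt(mu)/2; consult the paper
    for the exact condition."""
    if gamma is None:
        gamma = np.sqrt(mu) / 2.0
    y = np.asarray(y, dtype=float)
    x = np.zeros_like(y)
    for _ in range(n_iter):
        grad = x - y                                # gradient of 0.5*||y - x||^2
        x = prox_cauchy(x - mu * grad, mu, gamma)   # backward (proximal) step
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.zeros(256)
    clean[[30, 90, 150, 200]] = [4.0, -3.0, 5.0, -4.5]  # sparse test signal
    noisy = clean + 0.3 * rng.standard_normal(clean.size)
    denoised = fb_denoise(noisy)
    print("noisy MSE:   ", np.mean((noisy - clean) ** 2))
    print("denoised MSE:", np.mean((denoised - clean) ** 2))
```

On the sparse test signal above, the Cauchy prox strongly shrinks small, noise-dominated samples towards zero while largely preserving the large spikes, which is the qualitative behaviour that motivates heavy-tailed penalties over convex alternatives such as $L_1$.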
