Convergence Guarantees for Non-Convex Optimisation with Cauchy-Based Penalties

Research output: Contribution to journal › Article (Academic Journal) › peer-review

29 Citations (Scopus)
1 Download (Pure)

Abstract

In this paper, we propose a proximal splitting methodology with a non-convex penalty function based on the heavy-tailed Cauchy distribution. We first derive a closed-form expression for the proximal operator of the Cauchy prior, which makes it applicable in generic proximal splitting algorithms. We further derive the condition on the algorithm parameters under which minimisation problems with the Cauchy-based penalty function are guaranteed to converge to the global minimum, even though the penalty itself is non-convex. Setting the system parameters to satisfy the proposed condition keeps the overall cost function convex, so it can be minimised via the forward-backward (FB) algorithm. The proposed method based on Cauchy regularisation is evaluated on generic signal processing examples: 1D signal denoising in the frequency domain, and two image reconstruction tasks, namely de-blurring and denoising. We experimentally verify the proposed convexity conditions for various cases, and show the effectiveness of the proposed Cauchy-based non-convex penalty function over state-of-the-art penalty functions such as the L1 and total variation (TV) norms.
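To make the abstract's two ingredients concrete, the following is a minimal, illustrative NumPy sketch; the function names and the denoising usage are our own, not taken from the paper. It assumes the Cauchy penalty p(u) = log((γ² + u²)/γ), whose proximal operator reduces to finding the real root of the cubic u³ − x·u² + (γ² + 2μ)·u − x·γ² = 0, solved here in closed form via Cardano's method; the convexity condition referred to in the abstract then corresponds to γ ≥ √μ/2.

    import numpy as np

    def cauchy_prox(x, gamma, mu):
        # Element-wise proximal operator of p(u) = log((gamma^2 + u^2) / gamma).
        # Stationarity of (u - x)^2 / (2*mu) + p(u) gives the cubic
        #   u^3 - x*u^2 + (gamma^2 + 2*mu)*u - x*gamma^2 = 0,
        # which has a single real root when gamma >= sqrt(mu)/2.
        x = np.asarray(x, dtype=float)
        # Depress the cubic with u = t + x/3, giving t^3 + P*t + Q = 0.
        P = gamma**2 + 2.0 * mu - x**2 / 3.0
        Q = -2.0 * x**3 / 27.0 + x * (gamma**2 + 2.0 * mu) / 3.0 - x * gamma**2
        disc = (Q / 2.0)**2 + (P / 3.0)**3  # non-negative under the condition above
        s = np.sqrt(np.maximum(disc, 0.0))
        t = np.cbrt(-Q / 2.0 + s) + np.cbrt(-Q / 2.0 - s)
        return t + x / 3.0

    def fb_cauchy(y, grad_f, gamma, mu, n_iter=200):
        # Forward-backward iteration: x <- prox_{mu*p}(x - mu * grad_f(x)).
        # Requires mu <= 1/L (L: Lipschitz constant of grad_f) and
        # gamma >= sqrt(mu)/2 so that the overall cost stays convex.
        assert gamma >= np.sqrt(mu) / 2.0, "convexity condition violated"
        x = np.asarray(y, dtype=float).copy()
        for _ in range(n_iter):
            x = cauchy_prox(x - mu * grad_f(x), gamma, mu)
        return x

For pure denoising, f(x) = ‖x − y‖²/2, so grad_f = lambda x: x - y with mu = 1 (and, say, gamma = 0.6 ≥ √1/2); in that case the iteration reaches its fixed point cauchy_prox(y, gamma, 1) after a single step. For a linear forward model y = Ax + n, one would instead use grad_f = lambda x: A.T @ (A @ x - y) with mu ≤ 1/‖A‖².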
Original language: English
Number of pages: 15
Journal: IEEE Transactions on Signal Processing
Volume: 68
Issue number: 1
DOIs
Publication status: Published - 21 Oct 2020

Keywords

  • Non-convex regularisation
  • Convex optimisation
  • Cauchy proximal operator
  • Inverse problems
  • Denoising
  • Image reconstruction
