Representation Learning via Cauchy Convolutional Sparse Coding

Perla Mayo, Oktay Karakuş, Robin Holmes, Alin Achim

Research output: Contribution to journal › Article (Academic Journal) › peer-review

Abstract

In representation learning, Convolutional Sparse Coding (CSC) enables unsupervised learning of features by jointly minimising a cost function comprising an \(\ell_2\)-norm fidelity term and a sparsity-enforcing penalty. This work investigates using a regularisation term derived from an assumed Cauchy prior for the coefficients of the feature maps of a CSC generative model. The subproblem arising from this penalty is solved via its proximal operator, which is then applied iteratively, element-wise, to the coefficients of the feature maps in order to optimise the CSC cost function. The performance of the proposed Iterative Cauchy Thresholding (ICT) algorithm in reconstructing natural images is compared against the common choice of the \(\ell_1\)-norm, optimised via iterative soft thresholding (ISTA) and iterative hard thresholding (IHT). ICT outperforms both in most of these reconstruction experiments across various datasets, achieving an average PSNR up to 11.30 dB and 7.04 dB higher than ISTA and IHT, respectively.
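
The abstract names the prior but leaves the proximal step implicit. As a minimal sketch, assume the standard Cauchy density \(p(z) \propto \gamma / (\gamma^2 + z^2)\) with scale \(\gamma\), so that the negative log-prior contributes a penalty \(\mu \log(\gamma^2 + z^2)\) with weight \(\mu\) (both symbols are introduced here for illustration, not taken verbatim from the paper). The element-wise proximal operator is then

\[
\operatorname{prox}(x) = \arg\min_{u} \; \tfrac{1}{2}(u - x)^2 + \mu \log(\gamma^2 + u^2),
\]

whose stationarity condition is the cubic

\[
u^3 - x\,u^2 + (\gamma^2 + 2\mu)\,u - x\,\gamma^2 = 0,
\]

which has a single real root whenever \(\gamma \ge \sqrt{\mu}/2\), the regime in which the subproblem is convex. The NumPy sketch below solves this cubic in closed form via Cardano's method and wires it into one hypothetical proximal-gradient (ICT-style) update for a single-filter 1-D model; the function names, step size, and boundary handling are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def cauchy_prox(x, gamma, mu):
    """Element-wise proximal operator of the Cauchy penalty
    mu * log(gamma**2 + u**2): minimises
    0.5*(u - x)**2 + mu*log(gamma**2 + u**2) by taking the real
    root of u**3 - x*u**2 + (gamma**2 + 2*mu)*u - x*gamma**2 = 0.
    Assumes gamma >= sqrt(mu)/2, so the subproblem is convex and
    the cubic has exactly one real root (Cardano's formula).
    """
    x = np.asarray(x, dtype=float)
    # Depressed cubic t**3 + p*t + q = 0, obtained with u = t + x/3.
    p = gamma**2 + 2.0 * mu - x**2 / 3.0
    q = -2.0 * x**3 / 27.0 + x * (gamma**2 + 2.0 * mu) / 3.0 - x * gamma**2
    # Cardano discriminant; non-negative in the convex regime.
    s = np.sqrt(np.maximum(q**2 / 4.0 + p**3 / 27.0, 0.0))
    t = np.cbrt(-q / 2.0 + s) + np.cbrt(-q / 2.0 - s)  # real cube roots
    return t + x / 3.0

def ict_step(z, d, x, step, gamma, mu):
    """One hypothetical ICT-style update for a single-filter 1-D model
    x ~ d * z: a gradient step on the ell-2 fidelity term followed by
    the element-wise Cauchy proximal mapping. 'same'-mode boundary
    handling is a simplification."""
    residual = np.convolve(z, d, mode="same") - x
    grad = np.correlate(residual, d, mode="same")  # adjoint of convolution
    return cauchy_prox(z - step * grad, gamma, mu * step)
```

Because the mapping is element-wise and closed-form, its per-iteration cost is comparable to soft or hard thresholding, which is what makes the like-for-like PSNR comparison between ICT, ISTA and IHT in the abstract meaningful.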
Original language: English
Number of pages: 19
Journal: arXiv
Publication status: Unpublished - 8 Aug 2020

Keywords

  • eess.IV
  • cs.CV
  • cs.LG
