On the inverse Potts functional for single-image super-resolution problems

19 Aug 2020 · Pasquale Cascarano, Luca Calatroni, Elena Loli Piccolomini

We consider a variational model for single-image super-resolution based on the assumption that the gradient of the target image is sparse. To promote jump sparsity, we use isotropic and anisotropic $\ell^{0}$ inverse Potts gradient regularisation terms combined with a quadratic data fidelity, similarly to what was studied in [1] for general signal recovery problems. For the numerical realisation of the model, we consider a convergent ADMM algorithm. Unlike [1], [2], where approximate graph cuts and dynamic programming techniques were used to solve the non-convex substeps in the case of multivariate data, the proposed splitting allows their solutions to be computed explicitly by means of hard-thresholding and standard conjugate-gradient solvers. We quantitatively compare our results with several convex, non-convex and deep-learning-based approaches on several synthetic and real-world datasets. Our numerical results show that combining super-resolution with gradient sparsity is particularly helpful for object detection and labelling tasks (such as QR scanning and land-cover classification), where our results are shown to improve the classification precision of standard clustering algorithms and state-of-the-art deep architectures [3].
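The abstract already describes the whole pipeline: an $\ell^{0}$ penalty on the image gradient (the inverse Potts term) combined with a quadratic data fidelity, minimised by ADMM with a closed-form hard-thresholding substep and a conjugate-gradient substep. The sketch below is a rough illustration of such a scheme, not the authors' implementation: it assumes a generic degradation model $f = SHu$ with Gaussian blur $H$ and average-pooling downsampling $S$, the objective $\tfrac{1}{2}\|SHu-f\|_2^2 + \lambda\|\nabla u\|_0$, periodic boundary conditions and scaled dual variables; the operator choices and parameter names (`lam`, `rho`, `sigma`, `n_iter`) are illustrative assumptions.

```python
# Minimal sketch of an l0-gradient (inverse Potts) ADMM super-resolution scheme.
# Assumptions: Gaussian blur + average pooling as the forward operator, periodic
# boundaries, scaled dual form. Not the authors' implementation.
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.sparse.linalg import LinearOperator, cg

def grad(u):
    """Forward-difference gradient with periodic boundary, shape (2, H, W)."""
    return np.stack([np.roll(u, -1, axis=0) - u, np.roll(u, -1, axis=1) - u])

def div(p):
    """Discrete divergence, the negative adjoint of grad."""
    py, px = p
    return (py - np.roll(py, 1, axis=0)) + (px - np.roll(px, 1, axis=1))

def super_resolve(f, scale=2, lam=0.05, rho=1.0, sigma=1.0, n_iter=50):
    """l0-gradient super-resolution of a low-resolution image f (2-D array)."""
    H, W = f.shape[0] * scale, f.shape[1] * scale
    blur = lambda u: gaussian_filter(u, sigma, mode='wrap')  # self-adjoint blur
    A  = lambda u: blur(u).reshape(H // scale, scale, W // scale, scale).mean(axis=(1, 3))
    At = lambda g: blur(np.kron(g, np.ones((scale, scale))) / scale**2)  # adjoint of A

    # Normal-equation operator for the u-step: (A^T A + rho * grad^T grad) u
    def matvec(x):
        x = x.reshape(H, W)
        return (At(A(x)) - rho * div(grad(x))).ravel()
    K = LinearOperator((H * W, H * W), matvec=matvec, dtype=np.float64)

    u = np.kron(f, np.ones((scale, scale)))  # nearest-neighbour initial guess
    t = grad(u)
    mu = np.zeros_like(t)

    for _ in range(n_iter):
        # u-step: quadratic subproblem solved by conjugate gradient
        rhs = At(f) - rho * div(t - mu)
        u, _ = cg(K, rhs.ravel(), x0=u.ravel(), maxiter=30)
        u = u.reshape(H, W)

        # t-step: closed-form isotropic hard-thresholding of grad(u) + mu
        v = grad(u) + mu
        t = v * ((v ** 2).sum(axis=0) > 2.0 * lam / rho)

        # scaled dual update
        mu += grad(u) - t
    return u
```

The isotropic variant above thresholds each pixel's gradient vector as a whole; an anisotropic variant would instead threshold the horizontal and vertical differences componentwise with the same $2\lambda/\rho$ rule.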
