Regularizing linear inverse problems with convolutional neural networks

6 Jul 2019 · Reinhard Heckel

Deep convolutional neural networks trained on large datasets have emerged as an intriguing alternative for compressing images and solving inverse problems such as denoising and compressive sensing. However, it has only recently been realized that, even without training, convolutional networks can function as concise image models and thus regularize inverse problems. In this paper, we provide further evidence for this finding by studying variations of convolutional neural networks that map few weight parameters to an image. The networks we consider consist only of convolutional operations, with either fixed or parameterized filters, followed by ReLU non-linearities. We demonstrate that, with both fixed and parameterized convolutional filters, these networks can represent images with few coefficients. Moreover, the underparameterization regularizes inverse problems, in particular the recovery of an image from few observations. We show that, similar to standard compressive sensing guarantees, a number of measurements on the order of the number of model parameters suffices to recover an image from compressive measurements. Finally, we demonstrate that signal recovery with an un-trained convolutional network outperforms standard ℓ1 and total-variation minimization for magnetic resonance imaging (MRI).
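The sketch below illustrates the general idea described in the abstract: an un-trained, underparameterized convolutional generator is fit to compressive measurements, and the network weights themselves act as the regularizer. The architecture (a small deep-decoder-style network of upsampling, convolutions, and ReLUs), the image size, the Gaussian measurement operator, and all hyperparameters are illustrative assumptions, not the paper's exact setup.

```python
# Minimal sketch (assumed setup, not the authors' code): recover an image from
# compressive measurements y = A x by fitting the weights of an un-trained
# convolutional network G_w to the measurements.
import torch
import torch.nn as nn

n = 64 * 64   # number of image pixels (assumed)
m = 500       # number of compressive measurements (assumed)

class ConvGenerator(nn.Module):
    """Un-trained generator: only convolutions, upsampling, and ReLUs."""
    def __init__(self, channels=32, layers=4):
        super().__init__()
        blocks = []
        for _ in range(layers):
            blocks += [
                nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
                nn.Conv2d(channels, channels, kernel_size=3, padding=1),
                nn.ReLU(),
            ]
        blocks += [nn.Conv2d(channels, 1, kernel_size=1)]  # map to one image channel
        self.net = nn.Sequential(*blocks)

    def forward(self, z):
        return self.net(z)

torch.manual_seed(0)
G = ConvGenerator()
z = torch.randn(1, 32, 4, 4)         # fixed random input tensor (not optimized)
A = torch.randn(m, n) / m ** 0.5     # assumed Gaussian measurement operator
x_true = torch.rand(n)               # placeholder ground-truth image
y = A @ x_true                       # observed compressive measurements

# Fit only the network weights:  minimize_w || A G_w(z) - y ||^2
opt = torch.optim.Adam(G.parameters(), lr=1e-3)
for step in range(2000):
    opt.zero_grad()
    x_hat = G(z).reshape(-1)         # 4 upsampling steps: 4 -> 64, i.e. 64x64 pixels
    loss = ((A @ x_hat - y) ** 2).sum()
    loss.backward()
    opt.step()

x_rec = G(z).detach().reshape(64, 64)  # regularized reconstruction
```

Because the network has far fewer weight parameters than the image has pixels, fitting it to the measurements constrains the reconstruction, which is the sense in which the abstract says on the order of the number of model parameters many measurements suffice.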
