Search Results for author: Michael Unser

Found 46 papers, 10 papers with code

Function-Space Optimality of Neural Architectures With Multivariate Nonlinearities

no code implementations • 5 Oct 2023 • Rahul Parhi, Michael Unser

We investigate the function-space optimality (specifically, the Banach-space optimality) of a large class of shallow neural architectures with multivariate nonlinearities/activation functions.

Learning Weakly Convex Regularizers for Convergent Image-Reconstruction Algorithms

2 code implementations • 21 Aug 2023 • Alexis Goujon, Sebastian Neumayer, Michael Unser

We propose to learn non-convex regularizers with a prescribed upper bound on their weak-convexity modulus.

MRI Reconstruction

Mechanical Artifacts in Optical Projection Tomography: Classification and Automatic Calibration

no code implementations • 19 Jul 2023 • Yan Liu, Jonathan Dong, Thanh-an Pham, Francois Marelli, Michael Unser

Then, we introduce a calibration algorithm that recovers the unknown system parameters fed into the final 3D iterative reconstruction algorithm for a distortion-free volumetric image.

Optical Diffraction Tomography Meets Fluorescence Localization Microscopy

no code implementations • 18 Jul 2023 • Thanh-an Pham, Emmanuel Soubies, Ferréol Soulez, Michael Unser

We show that structural information can be extracted from single molecule localization microscopy (SMLM) data.

Position

On the Effect of Initialization: The Scaling Path of 2-Layer Neural Networks

no code implementations • 31 Mar 2023 • Sebastian Neumayer, Lénaïc Chizat, Michael Unser

In supervised learning, the regularization path is sometimes used as a convenient theoretical proxy for the optimization path of gradient descent initialized from zero.

A Neural-Network-Based Convex Regularizer for Inverse Problems

2 code implementations • 22 Nov 2022 • Alexis Goujon, Sebastian Neumayer, Pakshal Bohra, Stanislas Ducotterd, Michael Unser

The emergence of deep-learning-based methods to solve image-reconstruction problems has enabled a significant increase in reconstruction quality.

Denoising • MRI Reconstruction

Self-Supervised Isotropic Superresolution Fetal Brain MRI

no code implementations • 11 Nov 2022 • Kay Lächler, Hélène Lajous, Michael Unser, Meritxell Bach Cuadra, Pol del Aguila Pla

In this paper, we sidestep this difficulty by providing a proof of concept of a self-supervised single-volume superresolution framework for T2-weighted FBMRI (SAIR).

Anatomy • Image Reconstruction

From Nano to Macro: Overview of the IEEE Bio Image and Signal Processing Technical Committee

no code implementations • 31 Oct 2022 • Selin Aviyente, Alejandro Frangi, Erik Meijering, Arrate Muñoz-Barrutia, Michael Liebling, Dimitri Van De Ville, Jean-Christophe Olivo-Marin, Jelena Kovačević, Michael Unser

The Bio Image and Signal Processing (BISP) Technical Committee (TC) of the IEEE Signal Processing Society (SPS) promotes activities within the broad technical field of biomedical image and signal processing.

Improving Lipschitz-Constrained Neural Networks by Learning Activation Functions

1 code implementation • 28 Oct 2022 • Stanislas Ducotterd, Alexis Goujon, Pakshal Bohra, Dimitris Perdios, Sebastian Neumayer, Michael Unser

Lipschitz-constrained neural networks have several advantages over unconstrained ones and can be applied to a variety of problems, making them a topic of attention in the deep learning community.

Delaunay-Triangulation-Based Learning with Hessian Total-Variation Regularization

1 code implementation • 16 Aug 2022 • Mehrsa Pourya, Alexis Goujon, Michael Unser

Rectified linear unit (ReLU) neural networks generate continuous and piecewise-linear (CPWL) mappings and are the state-of-the-art approach for solving regression problems.

regression
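The CPWL property stated above is easy to verify numerically. The sketch below is an illustrative toy of ours (not the paper's Delaunay-based scheme): a one-hidden-layer ReLU network with scalar input is affine between its breakpoints, so its second finite differences vanish away from the knots.

```python
import numpy as np

# Toy check (not the paper's method): a scalar one-hidden-layer ReLU
# network is affine on each interval between its breakpoints.
rng = np.random.default_rng(1)
w1, b1 = rng.normal(size=5), rng.normal(size=5)  # hidden weights/biases
w2 = rng.normal(size=5)                          # output weights

def f(x):
    return w2 @ np.maximum(w1 * x + b1, 0.0)     # scalar-in, scalar-out ReLU net

breaks = -b1 / w1                                # knots where a ReLU switches
x0 = np.max(breaks) + 1.0                        # a point beyond all knots
h = 1e-3
second_diff = f(x0 + h) - 2 * f(x0) + f(x0 - h)
print(abs(second_diff))                          # ~0: the map is affine here
```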

From Kernel Methods to Neural Networks: A Unifying Variational Formulation

no code implementations • 29 Jun 2022 • Michael Unser

By contrast, for the total-variation norm, the solution takes the form of a two-layer neural network with an activation function that is determined by the regularization operator.

On the Number of Regions of Piecewise Linear Neural Networks

no code implementations • 17 Jun 2022 • Alexis Goujon, Arian Etemadi, Michael Unser

We first provide upper and lower bounds on the maximal number of linear regions of a CPWL NN given its depth, width, and the number of linear regions of its activation functions.
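For intuition about what is being counted, the base case in one dimension can be sketched as follows (a toy of ours, not the paper's depth-dependent bounds): a one-hidden-layer network with K ReLUs and scalar input has at most K + 1 linear regions, one knot per unit.

```python
import numpy as np

# Toy 1D region count (illustration only, not the paper's general bounds).
rng = np.random.default_rng(5)
K = 4
w1, b1 = rng.normal(size=K), rng.normal(size=K)   # hidden weights/biases
w2 = rng.normal(size=K)                           # output weights

knots = np.sort(-b1 / w1)                         # one breakpoint per ReLU
# Midpoints of the K + 1 intervals delimited by the knots.
mids = np.concatenate(([knots[0] - 1], (knots[:-1] + knots[1:]) / 2,
                       [knots[-1] + 1]))
# On each interval the activation pattern is fixed, so the slope is constant.
slopes = np.array([(w2 * w1 * (w1 * m + b1 > 0)).sum() for m in mids])
print(len(slopes))                                # K + 1 = 5 candidate pieces
```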

Stability of Image-Reconstruction Algorithms

no code implementations • 14 Jun 2022 • Pol del Aguila Pla, Sebastian Neumayer, Michael Unser

Robustness and stability of image-reconstruction algorithms have recently come under scrutiny.

Image Reconstruction

Asymptotic Stability in Reservoir Computing

1 code implementation • 7 Jun 2022 • Jonathan Dong, Erik Börve, Mushegh Rafayelyan, Michael Unser

Reservoir Computing is a class of Recurrent Neural Networks with internal weights fixed at random.
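A minimal echo-state-network sketch makes the "fixed random internal weights" idea concrete (an illustrative toy, not the paper's setup): only the linear readout is trained, here by least squares.

```python
import numpy as np

# Minimal reservoir-computing sketch: random fixed recurrent weights,
# trained linear readout (illustration only).
rng = np.random.default_rng(0)
N = 100
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9
w_in = rng.normal(0, 1, N)

u = np.sin(0.1 * np.arange(300))          # toy scalar input signal
x, states = np.zeros(N), []
for t in range(len(u)):
    x = np.tanh(W @ x + w_in * u[t])      # fixed random recurrent update
    states.append(x.copy())
X = np.array(states)

# Train the readout for one-step-ahead prediction of the input.
y = np.roll(u, -1)
w_out, *_ = np.linalg.lstsq(X[:-1], y[:-1], rcond=None)
pred = X[:-1] @ w_out
print(np.mean((pred - y[:-1]) ** 2))      # small training error
```

Scaling the spectral radius below one is the usual heuristic for keeping the reservoir dynamics stable.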

Bona fide Riesz projections for density estimation

no code implementations • 28 Apr 2022 • P. del Aguila Pla, Michael Unser

The projection of sample measurements onto a reconstruction space represented by a basis on a regular grid is a powerful and simple approach to estimate a probability density function.

Density Estimation
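The simplest instance of this grid-projection idea is a normalized histogram (an illustrative piecewise-constant estimator, not the paper's Riesz-projection construction): project the samples onto indicator basis functions on a regular grid and normalize so that the estimate integrates to one.

```python
import numpy as np

# Histogram as projection onto an indicator basis on a regular grid
# (toy illustration, not the paper's construction).
rng = np.random.default_rng(4)
samples = rng.normal(0.0, 1.0, 10_000)

grid = np.linspace(-4.0, 4.0, 41)         # regular grid of 40 bins
counts, _ = np.histogram(samples, bins=grid)
density = counts / (len(samples) * np.diff(grid))  # normalize to a pdf

print(np.sum(density * np.diff(grid)))    # ~1 (mass outside the grid is tiny)
```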

Approximation of Lipschitz Functions using Deep Spline Neural Networks

no code implementations • 13 Apr 2022 • Sebastian Neumayer, Alexis Goujon, Pakshal Bohra, Michael Unser

Lipschitz-constrained neural networks have many applications in machine learning.

Bayesian Inversion for Nonlinear Imaging Models using Deep Generative Priors

no code implementations • 18 Mar 2022 • Pakshal Bohra, Thanh-an Pham, Jonathan Dong, Michael Unser

In this work, we present a Bayesian reconstruction framework for nonlinear imaging models where we specify the prior knowledge on the image through a deep generative model.

Retrieval

A Statistical Framework to Investigate the Optimality of Signal-Reconstruction Methods

no code implementations • 18 Mar 2022 • Pakshal Bohra, Pol del Aguila Pla, Jean-François Giovannelli, Michael Unser

We present a statistical framework to benchmark the performance of reconstruction algorithms for linear inverse problems, in particular, neural-network-based methods that require large quantities of training data.

Benchmarking

Ridges, Neural Networks, and the Radon Transform

no code implementations • 4 Mar 2022 • Michael Unser

Ridges appear in the theory of neural networks as functional descriptors of the effect of a neuron, with the direction vector being encoded in the linear weights.

Coupled Splines for Sparse Curve Fitting

no code implementations • 3 Feb 2022 • Icíar LLoréns Jover, Thomas Debarre, Shayan Aziznejad, Michael Unser

We prove that an optimal solution to the inverse problem is a closed curve with spline components.

Diffraction Tomography with Helmholtz Equation: Efficient and Robust Multigrid-Based Solver

no code implementations • 8 Jul 2021 • Tao Hong, Thanh-an Pham, Eran Treister, Michael Unser

In this work, we introduce instead a Helmholtz-based nonlinear model for inverse scattering.

Continuous-Domain Formulation of Inverse Problems for Composite Sparse-Plus-Smooth Signals

no code implementations • 24 Mar 2021 • Thomas Debarre, Shayan Aziznejad, Michael Unser

We present a novel framework for the reconstruction of 1D composite signals assumed to be a mixture of two additive components, one sparse and the other smooth, given a finite number of linear measurements.

Optimal-transport-based metric for SMLM

no code implementations • 26 Oct 2020 • Quentin Denoyelle, Thanh-an Pham, Pol del Aguila Pla, Daniel Sage, Michael Unser

We propose the use of Flat Metric to assess the performance of reconstruction methods for single-molecule localization microscopy (SMLM) in scenarios where the ground-truth is available.

Deep Neural Networks with Trainable Activations and Controlled Lipschitz Constant

no code implementations • 17 Jan 2020 • Shayan Aziznejad, Harshit Gupta, Joaquim Campos, Michael Unser

To that end, we first establish a global bound for the Lipschitz constant of neural networks.
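The classical baseline for such a bound can be sketched as follows (this is the generic spectral-norm product, background material rather than the bound derived in the paper): with 1-Lipschitz activations such as ReLU, the Lipschitz constant of a feedforward network is at most the product of the spectral norms of its weight matrices.

```python
import numpy as np

# Generic global Lipschitz bound for a ReLU network: product of the
# spectral norms of the weight matrices (background fact, illustration).
rng = np.random.default_rng(2)
Ws = [rng.normal(size=(8, 4)), rng.normal(size=(3, 8))]
bound = np.prod([np.linalg.norm(W, 2) for W in Ws])  # spectral-norm product

def f(x):
    for W in Ws[:-1]:
        x = np.maximum(W @ x, 0.0)        # ReLU hidden layer
    return Ws[-1] @ x

# Empirical sanity check: no pair of points exceeds the bound.
ratios = []
for _ in range(1000):
    x, y = rng.normal(size=4), rng.normal(size=4)
    ratios.append(np.linalg.norm(f(x) - f(y)) / np.linalg.norm(x - y))
print(max(ratios) <= bound)               # True
```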

Time-Dependent Deep Image Prior for Dynamic MRI

1 code implementation • 3 Oct 2019 • Jaejun Yoo, Kyong Hwan Jin, Harshit Gupta, Jerome Yerly, Matthias Stuber, Michael Unser

The key ingredients of our method are threefold: 1) a fixed low-dimensional manifold that encodes the temporal variations of images; 2) a network that maps the manifold into a more expressive latent space; and 3) a convolutional neural network that generates a dynamic series of MRI images from the latent variables and that favors their consistency with the measurements in k-space.

MRI Reconstruction

Native Banach spaces for splines and variational inverse problems

no code implementations • 24 Apr 2019 • Michael Unser, Julien Fageot

In short, the native space for ${\rm L}$ and the (dual) norm $\|\cdot\|_{\mathcal{X}'}$ is the largest space of functions $f: \mathbb{R}^d \to \mathbb{R}$ such that $\|{\rm L} f\|_{\mathcal{X}'}<\infty$, subject to the constraint that the growth-restricted null space of ${\rm L}$ be finite-dimensional.

A unifying representer theorem for inverse problems and machine learning

no code implementations • 2 Mar 2019 • Michael Unser

We then use our theorem to retrieve a number of known results in the literature, e.g., the celebrated representer theorem of machine learning for RKHS, Tikhonov regularization, representer theorems for sparsity-promoting functionals, and the recovery of spikes, as well as a few new ones.

BIG-bench Machine Learning

Self-Supervised Deep Active Accelerated MRI

no code implementations • 14 Jan 2019 • Kyong Hwan Jin, Michael Unser, Kwang Moo Yi

The reconstruction network is trained to give the highest reconstruction quality, given the MCTS sampling pattern.

Multi-Kernel Regression with Sparsity Constraint

no code implementations • 2 Nov 2018 • Shayan Aziznejad, Michael Unser

In this paper, we provide a Banach-space formulation of supervised learning with generalized total-variation (gTV) regularization.

regression

Fast Rotational Sparse Coding

no code implementations • 12 Jun 2018 • Michael T. McCann, Vincent Andrearczyk, Michael Unser, Adrien Depeursinge

In this work, we propose an algorithm for a rotational version of sparse coding that is based on K-SVD with additional rotation operations.

Dictionary Learning • Texture Classification

Diverse M-Best Solutions by Dynamic Programming

no code implementations • 15 Mar 2018 • Carsten Haubold, Virginie Uhlmann, Michael Unser, Fred A. Hamprecht

Many computer vision pipelines involve dynamic programming primitives such as finding a shortest path or the minimum energy solution in a tree-shaped probabilistic graphical model.

Object Detection

A representer theorem for deep neural networks

no code implementations • 26 Feb 2018 • Michael Unser

We propose to optimize the activation functions of a deep neural network by adding a corresponding functional regularization to the cost function.

Fast Piecewise-Affine Motion Estimation Without Segmentation

no code implementations • 6 Feb 2018 • Denis Fortun, Martin Storath, Dennis Rickert, Andreas Weinmann, Michael Unser

Current algorithmic approaches for piecewise affine motion estimation are based on alternating motion segmentation and estimation.

Motion Estimation • Motion Segmentation +1

A Review of Convolutional Neural Networks for Inverse Problems in Imaging

2 code implementations • 11 Oct 2017 • Michael T. McCann, Kyong Hwan Jin, Michael Unser

In this survey paper, we review recent uses of convolutional neural networks (CNNs) to solve inverse problems in imaging.

Denoising • Image Reconstruction +1

Dictionary Learning Based on Sparse Distribution Tomography

no code implementations • ICML 2017 • Pedram Pad, Farnood Salehi, Elisa Celis, Patrick Thiran, Michael Unser

We propose a new statistical dictionary learning algorithm for sparse signals that is based on an $\alpha$-stable innovation model.

Dictionary Learning • Image Denoising

Efficient Inversion of Multiple-Scattering Model for Optical Diffraction Tomography

1 code implementation • 11 Jul 2017 • Emmanuel Soubies, Thanh-an Pham, Michael Unser

Optical diffraction tomography relies on solving an inverse scattering problem governed by the wave equation.

Computational Engineering, Finance, and Science • Numerical Analysis • Data Analysis, Statistics and Probability • Optics

Learning Convex Regularizers for Optimal Bayesian Denoising

no code implementations • 16 May 2017 • Ha Q. Nguyen, Emrah Bostan, Michael Unser

We propose a data-driven algorithm for the maximum a posteriori (MAP) estimation of stochastic processes from noisy observations.

Denoising

Deep Convolutional Neural Network for Inverse Problems in Imaging

no code implementations • 11 Nov 2016 • Kyong Hwan Jin, Michael T. McCann, Emmanuel Froustey, Michael Unser

The starting point of our work is the observation that unrolled iterative methods have the form of a CNN (filtering followed by point-wise non-linearity) when the normal operator (H*H, the adjoint of H times H) of the forward model is a convolution.
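This observation can be sketched in 1D with a toy circular-convolution model of our own (not the paper's architecture): one gradient step x <- x - gamma * H^T(Hx - y) for a convolutional forward model H is exactly filtering followed by a point-wise operation, which is why unrolled iterations resemble CNN layers.

```python
import numpy as np

# Toy 1D deblurring by unrolled gradient (Landweber) steps with a
# point-wise positivity nonlinearity (illustration, not the paper's CNN).
rng = np.random.default_rng(3)
n = 64
h = np.zeros(n)
h[:5] = [1, 4, 6, 4, 1]
h /= h.sum()                              # normalized blur kernel
H = np.fft.fft(h)                         # frequency response of H

def conv(k_hat, x):                       # circular convolution via the FFT
    return np.real(np.fft.ifft(k_hat * np.fft.fft(x)))

x_true = (rng.random(n) > 0.9).astype(float)   # sparse nonnegative spikes
y = conv(H, x_true)                            # blurred measurements

x, gamma = np.zeros(n), 1.0               # gamma <= 1 / max|H|^2 = 1
for _ in range(200):
    x = x - gamma * conv(np.conj(H), conv(H, x) - y)  # filtering step
    x = np.maximum(x, 0.0)                # point-wise nonlinearity
print(np.linalg.norm(conv(H, x) - y))     # data residual has shrunk
```

Note that the adjoint H^T of a circular convolution is itself a convolution (with the conjugated frequency response), so H^T H is a convolution too.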

On The Continuous Steering of the Scale of Tight Wavelet Frames

no code implementations • 7 Dec 2015 • Zsuzsanna Püspöki, John Paul Ward, Daniel Sage, Michael Unser

In analogy with steerable wavelets, we present a general construction of adaptable tight wavelet frames, with an emphasis on scaling operations.

Approximate Message Passing with Consistent Parameter Estimation and Applications to Sparse Learning

no code implementations • NeurIPS 2012 • Ulugbek Kamilov, Sundeep Rangan, Michael Unser, Alyson K. Fletcher

We present a method, called adaptive generalized approximate message passing (Adaptive GAMP), that enables joint learning of the statistics of the prior and measurement channel along with estimation of the unknown vector $\mathbf{x}$.

Sparse Learning
