2 code implementations • 21 Aug 2023 • Alexis Goujon, Sebastian Neumayer, Michael Unser
We propose to learn non-convex regularizers with a prescribed upper bound on their weak-convexity modulus.
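For context, the standard convex-analysis definition (not a quotation from the paper): a regularizer is ρ-weakly convex when adding a quadratic of modulus ρ is enough to convexify it, so prescribing an upper bound on ρ limits how non-convex the learned regularizer can be.

```latex
% Standard definition of rho-weak convexity (convex analysis, not the paper):
% R becomes convex once a quadratic of modulus rho is added.
R \ \text{is } \rho\text{-weakly convex}
\quad \Longleftrightarrow \quad
x \mapsto R(x) + \frac{\rho}{2} \lVert x \rVert_2^2 \ \text{is convex.}
```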
2 code implementations • 22 Nov 2022 • Alexis Goujon, Sebastian Neumayer, Pakshal Bohra, Stanislas Ducotterd, Michael Unser
The emergence of deep-learning-based methods to solve image-reconstruction problems has enabled a significant increase in reconstruction quality.
1 code implementation • 28 Oct 2022 • Stanislas Ducotterd, Alexis Goujon, Pakshal Bohra, Dimitris Perdios, Sebastian Neumayer, Michael Unser
Lipschitz-constrained neural networks have several advantages over unconstrained ones and apply to a wide range of problems, which has made them an active topic in the deep-learning community.
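To make the constraint concrete, here is a minimal PyTorch sketch of one common way to enforce it: spectral normalization of each linear layer. Since Lip(f∘g) ≤ Lip(f)·Lip(g), layers of spectral norm at most 1 composed with 1-Lipschitz activations give a 1-Lipschitz network. This is a generic construction, not the paper's method, which concerns the choice of activation under such constraints.

```python
import torch
import torch.nn as nn
from torch.nn.utils.parametrizations import spectral_norm

# Each spectrally normalized linear layer has spectral norm close to 1,
# and ReLU is 1-Lipschitz, so the composition is (approximately)
# 1-Lipschitz: Lip(f o g) <= Lip(f) * Lip(g).
model = nn.Sequential(
    spectral_norm(nn.Linear(16, 64)),
    nn.ReLU(),
    spectral_norm(nn.Linear(64, 64)),
    nn.ReLU(),
    spectral_norm(nn.Linear(64, 1)),
)

# Warm up the power-iteration estimate of each layer's spectral norm.
with torch.no_grad():
    for _ in range(50):
        model(torch.randn(8, 16))

# Empirical sanity check: |f(x) - f(y)| <= ||x - y|| for random pairs.
model.eval()
with torch.no_grad():
    x, y = torch.randn(256, 16), torch.randn(256, 16)
    ratio = (model(x) - model(y)).norm(dim=1) / (x - y).norm(dim=1)
print(f"max empirical Lipschitz ratio: {ratio.max().item():.4f}")  # ~1 or below
```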
1 code implementation • 16 Aug 2022 • Mehrsa Pourya, Alexis Goujon, Michael Unser
Rectified linear unit (ReLU) neural networks generate continuous and piecewise-linear (CPWL) mappings and are a state-of-the-art approach for solving regression problems.
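As a quick illustration of the CPWL property (a generic fact about ReLU networks, not code from the paper), the scalar one-hidden-layer network below is affine between the breakpoints where hidden units switch sign, so it has at most width + 1 linear regions:

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny ReLU network f: R -> R with one hidden layer of width 5.
W1, b1 = rng.normal(size=5), rng.normal(size=5)
w2, b2 = rng.normal(size=5), rng.normal()

def f(x):
    return w2 @ np.maximum(W1 * x + b1, 0.0) + b2

# f is continuous and piecewise-linear: its slope can only change at the
# breakpoints where a hidden unit switches on or off, i.e. x = -b1 / W1.
breaks = np.sort(-b1 / W1)
probes = np.concatenate(
    ([breaks[0] - 1.0], (breaks[:-1] + breaks[1:]) / 2, [breaks[-1] + 1.0])
)
for x in probes:  # one probe per linear region
    active = (W1 * x + b1) > 0              # which units are switched on
    slope = w2 @ np.where(active, W1, 0.0)  # constant within the region
    print(f"region around x = {x:+.2f}: slope = {slope:+.4f}")
```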
no code implementations • 17 Jun 2022 • Alexis Goujon, Arian Etemadi, Michael Unser
We first provide upper and lower bounds on the maximal number of linear regions of a CPWL NN given its depth, width, and the number of linear regions of its activation functions.
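For a sense of scale, the classical single-hidden-layer case is a useful reference point (a known hyperplane-arrangement bound, not the paper's new result): a ReLU layer of width w partitions the input space with w hyperplanes.

```latex
% Classical bound for one hidden ReLU layer of width w on inputs in R^d:
% w units induce w hyperplanes, and by Zaslavsky's theorem these cut R^d
% into at most sum_{j<=d} C(w, j) regions.
N_{\mathrm{regions}}(w, d) \le \sum_{j=0}^{d} \binom{w}{j},
\qquad \text{e.g.} \quad N_{\mathrm{regions}}(3, 2) \le 1 + 3 + 3 = 7.
```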
no code implementations • 13 Apr 2022 • Sebastian Neumayer, Alexis Goujon, Pakshal Bohra, Michael Unser
Lipschitz-constrained neural networks have many applications in machine learning.
no code implementations • NeurIPS 2021 Workshop on Deep Learning and Inverse Problems • Pakshal Bohra, Alexis Goujon, Dimitris Perdios, Sébastien Emery, Michael Unser
We show that averaged denoising operators built from 1-Lipschitz deep spline networks consistently outperform those built from 1-Lipschitz ReLU networks.
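For reference, an operator T is β-averaged when T = (1 − β)Id + βN with N nonexpansive, which is what makes fixed-point iterations of such denoising operators provably convergent. A minimal PyTorch sketch under that definition follows; the class name AveragedDenoiser and the wrapped denoiser are illustrative placeholders, and building the 1-Lipschitz denoiser from deep spline networks is the paper's contribution, not reproduced here.

```python
import torch.nn as nn

class AveragedDenoiser(nn.Module):
    """T = (1 - beta) * Id + beta * D for a 1-Lipschitz denoiser D.

    If D is nonexpansive (1-Lipschitz), T is beta-averaged, so the
    fixed-point iteration x_{k+1} = T(x_k) converges by the
    Krasnosel'skii-Mann theorem. The wrapped denoiser is a placeholder.
    """

    def __init__(self, denoiser: nn.Module, beta: float = 0.5):
        super().__init__()
        assert 0.0 < beta < 1.0, "averagedness requires beta in (0, 1)"
        self.denoiser = denoiser
        self.beta = beta

    def forward(self, x):
        # Convex combination of the identity and the denoiser.
        return (1.0 - self.beta) * x + self.beta * self.denoiser(x)
```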