no code implementations • 4 Apr 2024 • Michael Sucker, Jalal Fadili, Peter Ochs
We use PAC-Bayesian theory in the setting of learning-to-optimize.
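For background, a classical PAC-Bayesian generalization bound (the McAllester/Maurer form, for losses in $[0,1]$; stated here only as context, not as the specific bound derived in the paper) reads: for a prior $\pi$, any posterior $\rho$ over hypotheses, sample size $N$ and confidence level $\delta$, with probability at least $1-\delta$,
$$\mathbb{E}_{h\sim\rho}\big[L(h)\big] \;\le\; \mathbb{E}_{h\sim\rho}\big[\hat{L}_N(h)\big] + \sqrt{\frac{\mathrm{KL}(\rho\,\|\,\pi) + \ln\frac{2\sqrt{N}}{\delta}}{2N}},$$
where $L$ is the population risk and $\hat{L}_N$ the empirical risk; in learning-to-optimize, the role of the hypothesis $h$ is played by the learned optimization algorithm.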
no code implementations • 8 Mar 2024 • Nathan Buskulic, Jalal Fadili, Yvain Quéau
Advanced machine learning methods, most prominently neural networks, have become a standard approach for solving inverse problems in recent years.
no code implementations • 21 Sep 2023 • Nathan Buskulic, Jalal Fadili, Yvain Quéau
Neural networks have become a prominent approach for solving inverse problems in recent years.
no code implementations • 20 Mar 2023 • Nathan Buskulic, Yvain Quéau, Jalal Fadili
Neural networks have become a prominent approach for solving inverse problems in recent years.
no code implementations • 17 Oct 2022 • Jean-Jacques Godeme, Jalal Fadili, Xavier Buet, Myriam Zerrad, Michel Lequime, Claude Amra
In this paper, we consider the problem of phase retrieval, which consists of recovering an $n$-dimensional real vector from the magnitude of its $m$ linear measurements.
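Written out, the measurement model described above is
$$ y_i = \lvert \langle a_i, x \rangle \rvert, \qquad i = 1,\dots,m, $$
where $x \in \mathbb{R}^n$ is the unknown signal and the $a_i \in \mathbb{R}^n$ are the measurement vectors (generic notation, not necessarily that of the paper); in this real setting, $x$ can only be recovered up to a global sign.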
no code implementations • 22 Dec 2021 • Antonio Silveti-Falls, Cesare Molinari, Jalal Fadili
Under slightly stricter assumptions, we show almost sure weak convergence of the pointwise iterates to a saddle point.
no code implementations • 24 Dec 2020 • Mahesh Chandra Mukkamala, Jalal Fadili, Peter Ochs
We fix this issue by proposing the MAP property, which generalizes the $L$-smad property and is also valid for a large class of nonconvex nonsmooth composite problems.
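For context, the $L$-smad (smooth adaptable) property being generalized can be sketched as follows (a rough restatement of the standard definition, not of the MAP property itself): given a Legendre function $h$, a differentiable function $f$ is $L$-smooth adaptable with respect to $h$ if
$$ L\,h - f \quad\text{and}\quad L\,h + f \quad\text{are convex on } \operatorname{int}\operatorname{dom} h. $$
Taking $h = \tfrac{1}{2}\|\cdot\|^2$ recovers the usual $L$-Lipschitz-gradient condition.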
no code implementations • 11 May 2020 • Antonio Silveti-Falls, Cesare Molinari, Jalal Fadili
In this paper, we propose and analyze inexact and stochastic versions of the CGALP algorithm developed in the authors' previous paper, denoted ICGALP, which allow for errors in the computation of several important quantities.
no code implementations • 11 Feb 2020 • Kelvin Shuangjian Zhang, Gabriel Peyré, Jalal Fadili, Marcelo Pereyra
In this paper, we consider Langevin diffusions on a Hessian-type manifold and study a discretization that is closely related to the mirror-descent scheme.
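For orientation, the Euclidean baseline that such mirror/Hessian-geometry schemes generalize is the unadjusted Langevin algorithm (ULA). The sketch below implements only this standard baseline, not the discretization analyzed in the paper; `grad_f` is a placeholder for the gradient of the target potential.

```python
import numpy as np

def ula(grad_f, x0, step, n_iter, rng=np.random.default_rng(0)):
    """Unadjusted Langevin algorithm: x <- x - step * grad_f(x) + sqrt(2 * step) * noise."""
    x = np.array(x0, dtype=float)
    samples = []
    for _ in range(n_iter):
        noise = rng.standard_normal(x.shape)
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * noise
        samples.append(x.copy())
    return np.array(samples)

# Example: sampling from a standard Gaussian, whose potential is f(x) = ||x||^2 / 2.
samples = ula(grad_f=lambda x: x, x0=np.zeros(2), step=0.1, n_iter=1000)
```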
1 code implementation • 8 Feb 2020 • José G. Gómez García, Jalal Fadili, Christophe Chesneau
In this paper, we consider a model called CHARME (Conditional Heteroscedastic Autoregressive Mixture of Experts), a class of generalized mixtures of nonlinear nonparametric AR-ARCH time series.
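Schematically (and as an assumption about notation, not a quotation from the paper), a CHARME-type process mixes $K$ nonparametric AR-ARCH experts through a hidden switching sequence $(R_t)$:
$$ X_t = \sum_{k=1}^{K} \mathbf{1}\{R_t = k\}\,\Big( f_k(X_{t-1},\dots,X_{t-p}) + g_k(X_{t-1},\dots,X_{t-p})\,\varepsilon_t \Big), $$
where the $f_k$ model the conditional mean, the $g_k$ the conditional volatility, and $(\varepsilon_t)$ is an i.i.d. noise sequence.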
1 code implementation • 11 Jul 2017 • Jalal Fadili, Jérôme Malick, Gabriel Peyré
This pairing is crucial to track the strata that are identifiable by solutions of parametrized optimization problems or by iterates of optimization algorithms.
no code implementations • NeurIPS 2016 • Kévin Degraux, Gabriel Peyré, Jalal Fadili, Laurent Jacques
More precisely, we focus in detail on the cases of $\ell_1$ and $\ell_\infty$ losses, and contrast them with the usual $\ell_2$ loss. While these losses are routinely used to account for either sparse ($\ell_1$ loss) or uniform ($\ell_\infty$ loss) noise models, a theoretical analysis of their performance is still lacking.
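To make the contrast concrete, the sketch below (generic NumPy code with placeholder names `Phi`, `x`, `y`, not tied to the paper's experiments) evaluates the three data-fidelity losses on the residual $\Phi x - y$:

```python
import numpy as np

def data_fidelity(Phi, x, y, loss="l2"):
    """Evaluate the l1, l_inf or l2 norm of the residual Phi @ x - y."""
    r = Phi @ x - y
    if loss == "l1":      # typically paired with sparse (impulsive) noise models
        return np.sum(np.abs(r))
    if loss == "linf":    # typically paired with uniform (bounded) noise models
        return np.max(np.abs(r))
    return np.sqrt(np.sum(r ** 2))   # standard l2 loss (Gaussian-type noise)

rng = np.random.default_rng(0)
Phi = rng.standard_normal((20, 10))
x, y = rng.standard_normal(10), rng.standard_normal(20)
print(data_fidelity(Phi, x, y, "l1"), data_fidelity(Phi, x, y, "linf"))
```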
no code implementations • NeurIPS 2016 • Jingwei Liang, Jalal Fadili, Gabriel Peyré
In this paper, we propose a multi-step inertial Forward--Backward splitting algorithm for minimizing the sum of two not-necessarily-convex functions, one of which is proper and lower semi-continuous while the other is differentiable with a Lipschitz continuous gradient.
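As a simplified illustration of the algorithmic template (a single-inertial-step sketch with an $\ell_1$ regularizer and a fixed inertia parameter; the multi-step scheme and its step-size rules are those of the paper, not reproduced here):

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def inertial_forward_backward(grad_f, x0, step, inertia=0.3, reg=0.1, n_iter=200):
    """One-step inertial Forward--Backward: extrapolate, gradient step on f, prox step on g."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(n_iter):
        y = x + inertia * (x - x_prev)          # inertial extrapolation
        x_prev, x = x, soft_threshold(y - step * grad_f(y), step * reg)
    return x

# Example: LASSO-type problem with f(x) = 0.5 * ||A x - b||^2 and g(x) = reg * ||x||_1.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((30, 50)), rng.standard_normal(30)
x_hat = inertial_forward_backward(lambda x: A.T @ (A @ x - b), np.zeros(50),
                                  step=1.0 / np.linalg.norm(A, 2) ** 2)
```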
no code implementations • NeurIPS 2014 • Jingwei Liang, Jalal Fadili, Gabriel Peyré
In this paper, we consider the Forward--Backward proximal splitting algorithm for minimizing the sum of two proper closed convex functions, one of which has a Lipschitz continuous gradient while the other is partly smooth relative to an active manifold $\mathcal{M}$.
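The underlying iteration is the classical Forward--Backward (proximal gradient) step, recalled here for context:
$$ x_{k+1} = \operatorname{prox}_{\gamma g}\big( x_k - \gamma \nabla f(x_k) \big), \qquad \operatorname{prox}_{\gamma g}(z) = \operatorname*{argmin}_{x} \; g(x) + \tfrac{1}{2\gamma}\|x - z\|^2, $$
with a step size $0 < \gamma < 2/L$, where $L$ is the Lipschitz constant of $\nabla f$; partial smoothness of $g$ relative to $\mathcal{M}$ is the structure under which finite identification of the active manifold by such iterates is typically studied.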
no code implementations • 12 Apr 2013 • Simon Beckouche, Jean-Luc Starck, Jalal Fadili
Astronomical images are consistently affected by multiple defects arising from the intrinsic properties of the acquisition equipment and from atmospheric conditions.