Search Results for author: Jérôme Bolte

Found 9 papers, 3 papers with code

Differentiating Nonsmooth Solutions to Parametric Monotone Inclusion Problems

no code implementations15 Dec 2022 Jérôme Bolte, Edouard Pauwels, Antonio José Silveti-Falls

We leverage path differentiability and a recent result on nonsmooth implicit differentiation calculus to give sufficient conditions ensuring that the solution to a monotone inclusion problem will be path differentiable, with formulas for computing its generalized gradient.

On the complexity of nonsmooth automatic differentiation

no code implementations1 Jun 2022 Jérôme Bolte, Ryan Boustany, Edouard Pauwels, Béatrice Pesquet-Popescu

Using the notion of conservative gradient, we provide a simple model to estimate the computational costs of the backward and forward modes of algorithmic differentiation for a wide class of nonsmooth programs.
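The forward mode mentioned above can be made concrete with a toy sketch (hypothetical code, not the paper's cost model): dual numbers propagate a tangent through the program, and at a kink the user supplies one admissible derivative value, in the spirit of a conservative gradient.

```python
# Hypothetical sketch: forward-mode AD with dual numbers on a nonsmooth
# program. At a kink, the propagated derivative is one arbitrary admissible
# value chosen by the user, mimicking how AD tools treat nonsmooth primitives.
class Dual:
    def __init__(self, val, dot):
        self.val, self.dot = val, dot

    def __add__(self, other):
        return Dual(self.val + other.val, self.dot + other.dot)

    def __mul__(self, other):
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)

def abs_ad(d, at_zero=0.0):
    """|x| with a chosen derivative `at_zero` in [-1, 1] at the kink."""
    if d.val > 0.0:
        return Dual(d.val, d.dot)
    if d.val < 0.0:
        return Dual(-d.val, -d.dot)
    return Dual(0.0, at_zero * d.dot)

# f(x) = |x*x - 9| evaluated exactly at the kink x = 3: the forward-mode
# "derivative" depends on the value chosen at 0.
x = Dual(3.0, 1.0)
nine = Dual(-9.0, 0.0)
```

Running `abs_ad(x * x + nine, at_zero=s)` returns tangent `6 * s`, showing how different choices at the kink yield different (all formally admissible) derivatives.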

Automatic differentiation of nonsmooth iterative algorithms

no code implementations31 May 2022 Jérôme Bolte, Edouard Pauwels, Samuel Vaiter

Is there a limiting object for nonsmooth piggyback automatic differentiation (AD)?

Numerical influence of ReLU'(0) on backpropagation

1 code implementation NeurIPS 2021 David Bertoin, Jérôme Bolte, Sébastien Gerchinovitz, Edouard Pauwels

In theory, the choice of ReLU'(0) in [0, 1] for a neural network has a negligible influence both on backpropagation and training.
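A minimal sketch of the choice at stake (hypothetical code, not from the paper): backpropagation must assign some value s in [0, 1] to the derivative of ReLU at 0, and the gradient it computes can depend on that choice.

```python
# Hypothetical sketch: the derivative of ReLU is undefined at 0, so a
# backpropagation implementation must pick some value s in [0, 1] there.
def relu(x):
    return max(x, 0.0)

def relu_grad(x, s=0.0):
    """Derivative of ReLU, with `s` the arbitrary choice for ReLU'(0)."""
    if x > 0.0:
        return 1.0
    if x < 0.0:
        return 0.0
    return s

# Chain rule for y = relu(w * x) at a point where w * x == 0: the computed
# gradient with respect to w is relu_grad(0, s) * x, so it depends on s.
x, w = 2.0, 0.0
grad_s0 = relu_grad(w * x, s=0.0) * x  # 0.0 with the common default s = 0
grad_s1 = relu_grad(w * x, s=1.0) * x  # 2.0 with s = 1
```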

Nonsmooth Implicit Differentiation for Machine Learning and Optimization

no code implementations NeurIPS 2021 Jérôme Bolte, Tam Le, Edouard Pauwels, Antonio Silveti-Falls

In view of training increasingly complex learning architectures, we establish a nonsmooth implicit function theorem with an operational calculus.
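As a smooth toy illustration of the implicit-differentiation idea (the paper's contribution is the much harder nonsmooth case), one can differentiate a solver's output x*(p) through the equation it solves rather than through the solver's iterations. All names here are hypothetical.

```python
# Smooth toy example: x*(p) is defined implicitly by F(x, p) = x**2 - p = 0,
# i.e. x*(p) = sqrt(p). The implicit function theorem gives
#     dx*/dp = -(dF/dx)^{-1} * dF/dp = 1 / (2 * x*),
# with no need to differentiate through the iterative solver below.
def solve(p, iters=50):
    """Find x with x**2 = p by Newton's method (the 'solver')."""
    x = max(p, 1.0)
    for _ in range(iters):
        x = 0.5 * (x + p / x)
    return x

def implicit_grad(p):
    x_star = solve(p)
    dF_dx = 2.0 * x_star   # partial derivative of F(x, p) = x**2 - p in x
    dF_dp = -1.0           # partial derivative in p
    return -dF_dp / dF_dx  # = 1 / (2 * sqrt(p))
```

For p = 4 this returns 0.25, matching the closed form d sqrt(p)/dp = 1 / (2 sqrt(p)); the nonsmooth theorem in the paper extends this recipe to settings where F is not differentiable.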

Second-order step-size tuning of SGD for non-convex optimization

1 code implementation5 Mar 2021 Camille Castera, Jérôme Bolte, Cédric Févotte, Edouard Pauwels

Aiming at a direct and simple improvement of vanilla SGD, this paper presents a fine-tuning of its step sizes in the mini-batch case.
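To illustrate the general idea of curvature-aware step sizes (this is a Barzilai-Borwein-style estimate standing in for the paper's algorithm, not the authors' method), a sketch on a 1-D quadratic:

```python
# Hypothetical sketch: tune the SGD step size from differences of iterates
# and gradients, a cheap second-order-like estimate. Not the paper's method.
def grad(w):
    """Gradient of the toy objective f(w) = 0.5 * (w - 3)**2."""
    return w - 3.0

w, lr = 0.0, 0.1
g_prev, w_prev = grad(w), w
w = w - lr * g_prev                 # one vanilla step to get a history
for _ in range(20):
    g = grad(w)
    s, y = w - w_prev, g - g_prev   # iterate and gradient differences
    if abs(y) > 1e-12:
        lr = abs(s / y)             # inverse-curvature step-size estimate
    w_prev, g_prev = w, g
    w = w - lr * g                  # gradient step with the tuned step size
```

On this quadratic the curvature is 1, so the estimated step size settles at 1 and the iterates converge to the minimizer w = 3.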

A Hölderian backtracking method for min-max and min-min problems

no code implementations17 Jul 2020 Jérôme Bolte, Lilian Glaudin, Edouard Pauwels, Mathieu Serrurier

We present a new algorithm to solve min-max or min-min problems out of the convex world.

Conservative set valued fields, automatic differentiation, stochastic gradient method and deep learning

no code implementations23 Sep 2019 Jérôme Bolte, Edouard Pauwels

Modern problems in AI or in numerical analysis require nonsmooth approaches with a flexible calculus.
