Search Results for author: Franck Mamalet

Found 7 papers, 4 papers with code

Quantized Approximately Orthogonal Recurrent Neural Networks

no code implementations • 5 Feb 2024 • Armand Foucault, Franck Mamalet, François Malgouyres

Orthogonal recurrent neural networks (ORNNs) are an appealing option for learning tasks involving time series with long-term dependencies, thanks to their simplicity and computational stability.

Tasks: Quantization, Time Series
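
The paper combines two ingredients: a recurrent weight matrix constrained to be orthogonal, and quantization of that matrix, which can only preserve orthogonality approximately. Below is a minimal PyTorch sketch of both pieces; the naive uniform quantizer stands in for the paper's actual scheme, and `OrthogonalRNNCell` and `uniform_quantize` are hypothetical names:

```python
import torch
import torch.nn as nn

class OrthogonalRNNCell(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.in_proj = nn.Linear(input_size, hidden_size)
        self.rec = nn.Linear(hidden_size, hidden_size, bias=False)
        # Keep the recurrent matrix on the orthogonal manifold during training.
        nn.utils.parametrizations.orthogonal(self.rec, "weight")

    def forward(self, x, h):
        return torch.tanh(self.in_proj(x) + self.rec(h))

def uniform_quantize(w, n_bits=4):
    # Naive symmetric uniform quantizer, for illustration only.
    scale = w.abs().max() / (2 ** (n_bits - 1) - 1)
    return torch.round(w / scale) * scale

cell = OrthogonalRNNCell(input_size=8, hidden_size=32)
h = torch.zeros(1, 32)
for x_t in torch.randn(10, 1, 8):  # toy sequence of length 10
    h = cell(x_t, h)

# Quantizing destroys exact orthogonality; the matrix remains only
# approximately orthogonal, hence the paper's title.
w_q = uniform_quantize(cell.rec.weight.detach())
```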

DP-SGD Without Clipping: The Lipschitz Neural Network Way

1 code implementation • 25 May 2023 • Louis Béthune, Thomas Massena, Thibaut Boissin, Yannick Prudent, Corentin Friedrich, Franck Mamalet, Aurélien Bellet, Mathieu Serrurier, David Vigouroux

To provide sensitivity bounds and bypass the drawbacks of the clipping process, we propose to rely on Lipschitz-constrained networks.
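
As a rough illustration of the idea: DP-SGD normally clips per-sample gradients to bound sensitivity before adding noise; if every layer is Lipschitz-constrained, per-sample gradient norms are bounded a priori and the clipping step can be dropped. The sketch below (PyTorch assumed) only shows the layer constraint via spectral normalization and a noisy update calibrated to an assumed bound K; it is not the paper's construction or its privacy accounting:

```python
import torch
import torch.nn as nn

def lipschitz_mlp(d_in, d_hidden, d_out):
    layers = []
    for a, b in [(d_in, d_hidden), (d_hidden, d_hidden), (d_hidden, d_out)]:
        lin = nn.Linear(a, b)
        # Spectral norm caps each layer's Lipschitz constant near 1, so the
        # whole network (ReLU is itself 1-Lipschitz) is roughly 1-Lipschitz.
        nn.utils.parametrizations.spectral_norm(lin)
        layers += [lin, nn.ReLU()]
    return nn.Sequential(*layers[:-1])  # no activation after the last layer

model = lipschitz_mlp(20, 64, 2)
x, y = torch.randn(16, 20), torch.randint(0, 2, (16,))
nn.functional.cross_entropy(model(x), y).backward()

# Hypothetical noisy update: with an a priori bound K on per-sample gradient
# norms, the Gaussian noise is calibrated to K directly, with no clipping.
sigma, K, lr, batch = 1.0, 1.0, 0.1, 16  # illustrative values
with torch.no_grad():
    for p in model.parameters():
        p -= lr * (p.grad + sigma * K * torch.randn_like(p.grad) / batch)
```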

On the explainable properties of 1-Lipschitz Neural Networks: An Optimal Transport Perspective

no code implementations • NeurIPS 2023 • Mathieu Serrurier, Franck Mamalet, Thomas Fel, Louis Béthune, Thibaut Boissin

Input gradients play a pivotal role in a variety of applications, including adversarial attack algorithms for evaluating model robustness, explainable AI techniques for generating saliency maps, and counterfactual explanations. However, saliency maps generated by traditional neural networks are often noisy and provide limited insight.

Tasks: Adversarial Attack, counterfactual +1
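
Since the abstract centers on input gradients, here is a minimal PyTorch sketch of a gradient-based saliency map; the paper's claim is that for 1-Lipschitz networks trained with an optimal-transport-flavored loss, this gradient is far less noisy than for unconstrained networks (`saliency_map` is a hypothetical helper):

```python
import torch

def saliency_map(model, x, target_class):
    # The saliency map is the gradient of a class score w.r.t. the input.
    x = x.clone().requires_grad_(True)
    score = model(x)[0, target_class]  # logit of the class of interest
    score.backward()
    return x.grad.abs()

# usage: sal = saliency_map(model, image.unsqueeze(0), target_class=3)
```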

Existence, Stability and Scalability of Orthogonal Convolutional Neural Networks

no code implementations • 12 Aug 2021 • El Mehdi Achour, François Malgouyres, Franck Mamalet

Imposing orthogonality on the layers of neural networks is known to facilitate learning by limiting exploding/vanishing gradients, decorrelating the features, and improving robustness.
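
For context, the standard soft relaxation of layer orthogonality penalizes ||W Wᵀ − I||²_F on each reshaped kernel; the paper itself studies when exact orthogonal convolutions exist and how they scale. A minimal PyTorch sketch of the soft penalty (not the paper's construction):

```python
import torch
import torch.nn as nn

def orthogonality_penalty(conv: nn.Conv2d) -> torch.Tensor:
    # Soft orthogonality: drive the Gram matrix of the flattened
    # kernel toward the identity.
    w = conv.weight.flatten(1)  # (out_channels, in_channels * k * k)
    gram = w @ w.t()
    eye = torch.eye(gram.size(0), device=w.device)
    return ((gram - eye) ** 2).sum()

conv = nn.Conv2d(16, 32, kernel_size=3)
loss = orthogonality_penalty(conv)  # add to the task loss with a small weight
loss.backward()
```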
