Search Results for author: François Malgouyres

Found 9 papers, 0 papers with code

Quantized Approximately Orthogonal Recurrent Neural Networks

no code implementations • 5 Feb 2024 • Armand Foucault, Franck Mamalet, François Malgouyres

Orthogonal recurrent neural networks (ORNNs) are an appealing option for learning tasks involving time series with long-term dependencies, thanks to their simplicity and computational stability.
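
To see the stability point concretely, here is a minimal NumPy sketch (illustrative only, not the paper's quantized ORNN implementation; the dimension and horizon are arbitrary choices) showing that an orthogonal recurrent matrix keeps the hidden-state norm constant over long horizons:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random orthogonal recurrent matrix, obtained via QR decomposition.
W, _ = np.linalg.qr(rng.standard_normal((64, 64)))

h = rng.standard_normal(64)
norm_before = np.linalg.norm(h)

# Iterate the linear part of the recurrence h_{t+1} = W h_t:
# with an orthogonal W the norm neither explodes nor vanishes.
for _ in range(10_000):
    h = W @ h

print(norm_before, np.linalg.norm(h))  # equal up to round-off
```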

Quantization • Time Series

Support Exploration Algorithm for Sparse Support Recovery

no code implementations • 31 Jan 2023 • Mimoun Mohamed, François Malgouyres, Valentin Emiya, Caroline Chaux

We introduce a new algorithm promoting sparsity, called the Support Exploration Algorithm (SEA), and analyze it in the context of support recovery/model selection problems. The algorithm can be interpreted as an instance of the straight-through estimator (STE) applied to the resolution of a sparse linear inverse problem.
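
Reading the abstract, the STE mechanism can be sketched as follows (a rough NumPy illustration under stated assumptions, not the authors' reference implementation; the step size, iteration count, and hard-thresholding rule are choices of this sketch):

```python
import numpy as np

def sea_sketch(A, y, k, n_iter=200, lr=None):
    """Support exploration via a straight-through estimator (illustrative).

    A dense variable x_dense is updated with full gradients, while the
    sparse estimate used in the residual keeps only its k largest entries.
    """
    m, n = A.shape
    lr = lr or 1.0 / np.linalg.norm(A, 2) ** 2   # step size from the spectral norm
    x_dense = A.T @ y                            # dense exploration variable
    best_supp, best_err = None, np.inf
    for _ in range(n_iter):
        supp = np.argsort(np.abs(x_dense))[-k:]  # current candidate support
        x_sparse = np.zeros(n)
        x_sparse[supp] = x_dense[supp]           # hard-thresholded iterate
        r = A @ x_sparse - y
        # Straight-through step: the gradient computed at x_sparse is
        # applied to x_dense, ignoring the non-differentiable thresholding.
        x_dense -= lr * (A.T @ r)
        err = np.linalg.norm(r)
        if err < best_err:
            best_supp, best_err = supp, err
    return best_supp, best_err

# Toy usage: look for the support of a 5-sparse signal.
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 200)) / np.sqrt(60)
x_true = np.zeros(200)
x_true[rng.choice(200, size=5, replace=False)] = 1.0
supp, err = sea_sketch(A, A @ x_true, k=5)
print(sorted(supp), err)
```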

Model Selection

Local Identifiability of Deep ReLU Neural Networks: the Theory

no code implementations • 15 Jun 2022 • Joachim Bona-Pellissier, François Malgouyres, François Bachoc

Is a sample rich enough to determine, at least locally, the parameters of a neural network?

A general approximation lower bound in $L^p$ norm, with applications to feed-forward neural networks

no code implementations • 9 Jun 2022 • El Mehdi Achour, Armand Foucault, Sébastien Gerchinovitz, François Malgouyres

Given two sets $F$, $G$ of real-valued functions, we first prove a general lower bound on how well functions in $F$ can be approximated in $L^p(\mu)$ norm by functions in $G$, for any $p \geq 1$ and any probability measure $\mu$.
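
For concreteness, the quantity whose lower bound is studied is the best $L^p(\mu)$ approximation error; writing out the standard definitions (the notation $d_{L^p(\mu)}$ is this note's, and these are textbook formulas, not the paper's bound itself):

\[
d_{L^p(\mu)}(f, G) = \inf_{g \in G} \|f - g\|_{L^p(\mu)},
\qquad
\|h\|_{L^p(\mu)} = \Big( \int |h|^p \, \mathrm{d}\mu \Big)^{1/p}.
\]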

Open-Ended Question Answering

Parameter identifiability of a deep feedforward ReLU neural network

no code implementations • 24 Dec 2021 • Joachim Bona-Pellissier, François Bachoc, François Malgouyres

The possibility for one to recover the parameters (weights and biases) of a neural network thanks to the knowledge of its function on a subset of the input space can be, depending on the situation, a curse or a blessing.

Existence, Stability and Scalability of Orthogonal Convolutional Neural Networks

no code implementations • 12 Aug 2021 • El Mehdi Achour, François Malgouyres, Franck Mamalet

Imposing orthogonality on the layers of neural networks is known to facilitate learning by limiting gradient explosion/vanishing, decorrelating the features, and improving robustness.
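
A common way to encourage (though not exactly enforce) such orthogonality is a soft penalty on a layer's Gram matrix; a minimal NumPy sketch of that generic regularizer, which is not necessarily the construction analyzed in the paper:

```python
import numpy as np

def soft_orthogonality_penalty(W):
    """Frobenius penalty ||W^T W - I||_F^2 pushing a layer toward orthogonality.

    For a conv layer, W would be the kernel reshaped to a 2-D matrix;
    exact orthogonal convolutions need more care (the subject of the paper).
    """
    k = W.shape[1]
    G = W.T @ W                      # Gram matrix of the columns
    return np.linalg.norm(G - np.eye(k), "fro") ** 2

rng = np.random.default_rng(0)
W = rng.standard_normal((128, 32))
Q, _ = np.linalg.qr(W)               # column-orthogonal reference
print(soft_orthogonality_penalty(W), soft_orthogonality_penalty(Q))
```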

The loss landscape of deep linear neural networks: a second-order analysis

no code implementations • 28 Jul 2021 • El Mehdi Achour, François Malgouyres, Sébastien Gerchinovitz

We characterize, among all critical points, which are global minimizers, strict saddle points, and non-strict saddle points.
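
For readers unfamiliar with this taxonomy, the standard second-order definitions (textbook conventions, not quoted from the paper): a critical point $\theta$ satisfies $\nabla L(\theta) = 0$; it is a strict saddle when the Hessian has a direction of negative curvature, and a non-strict saddle when it does not, yet $\theta$ is still not a local minimizer:

\[
\text{strict saddle: } \lambda_{\min}\big(\nabla^2 L(\theta)\big) < 0,
\qquad
\text{non-strict saddle: } \nabla^2 L(\theta) \succeq 0 \ \text{and } \theta \text{ not a local minimizer}.
\]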

Overestimation learning with guarantees

no code implementations • 26 Jan 2021 • Adrien Gauffriau, François Malgouyres, Mélanie Ducoffe

Experiments on real data show that the method makes it possible to use the surrogate function in embedded systems for which an underestimation is critical, when computing the reference function requires too many resources.

Multilinear compressive sensing and an application to convolutional linear networks

no code implementations • 23 Mar 2017 • François Malgouyres, Joseph Landsberg

In this paper, we provide necessary and sufficient conditions on the network topology under which a stability property holds.

Compressive Sensing
