no code implementations • 1 Feb 2023 • Mathieu Chalvidal, Thomas Serre, Rufin VanRullen
Research in machine learning has polarized into two general regression approaches: transductive methods derive estimates directly from the available data but are usually problem-agnostic.
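To make the transductive/inductive distinction concrete, here is a minimal sketch of a classic transductive estimator, Nadaraya-Watson kernel regression: predictions are computed directly from the training data with no fitted parameters. This is an illustrative stand-in, not the method proposed in the paper; the function name and bandwidth value are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def nadaraya_watson(x_train, y_train, x_query, bandwidth=0.2):
    """Transductive regression: the estimate is a data-weighted average,
    computed directly from the training set (no learned parameters)."""
    d = x_query[:, None] - x_train[None, :]
    w = np.exp(-0.5 * (d / bandwidth) ** 2)   # Gaussian kernel weights
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

# Noisy samples of a smooth target function.
x = rng.uniform(-3, 3, 200)
y = np.sin(x) + rng.normal(0, 0.1, 200)

xq = np.linspace(-2, 2, 50)
yq = nadaraya_watson(x, y, xq)   # close to sin(xq) where data is dense
```

Because the estimator depends only on the raw samples, it adapts to any dataset but carries no problem-specific inductive bias, which is the trade-off the abstract alludes to.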
1 code implementation • 9 Jun 2022 • Thomas Fel, Lucas Hervier, David Vigouroux, Antonin Poche, Justin Plakoo, Remi Cadene, Mathieu Chalvidal, Julien Colin, Thibaut Boissin, Louis Bethune, Agustin Picard, Claire Nicodeme, Laurent Gardes, Gregory Flandin, Thomas Serre
Today's most advanced machine-learning models are hardly scrutable.
no code implementations • 4 Feb 2022 • Mathieu Chalvidal, Thomas Serre, Rufin VanRullen
Deep Reinforcement Learning has demonstrated the potential of neural networks tuned with gradient descent for solving complex tasks in well-delimited environments.
1 code implementation • NeurIPS 2021 • Thomas Fel, Remi Cadene, Mathieu Chalvidal, Matthieu Cord, David Vigouroux, Thomas Serre
We describe a novel attribution method which is grounded in Sensitivity Analysis and uses Sobol indices.
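Sobol indices come from global sensitivity analysis: the first-order index of input i is the fraction of output variance explained by that input alone. Below is a generic pick-freeze Monte Carlo estimator on a toy function, to illustrate the quantity the attribution method is built on; it is a sketch of standard Sobol estimation, not the paper's image-attribution pipeline, and the function names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def first_order_sobol(f, dim, n=100_000):
    """Pick-freeze Monte Carlo estimate of first-order Sobol indices.

    S_i = Var[E[f(X) | X_i]] / Var[f(X)], i.e. the share of output
    variance attributable to input i alone.
    """
    A = rng.uniform(size=(n, dim))
    B = rng.uniform(size=(n, dim))
    fA, fB = f(A), f(B)
    var = fA.var()
    indices = []
    for i in range(dim):
        AB = B.copy()
        AB[:, i] = A[:, i]  # "freeze" coordinate i to its value in A
        indices.append(np.mean(fA * (f(AB) - fB)) / var)
    return np.array(indices)

# Toy model: strong dependence on x0, weak on x1, none on x2.
f = lambda x: 4.0 * x[:, 0] + 0.5 * x[:, 1]
S = first_order_sobol(f, dim=3)   # S[0] dominates, S[2] is near zero
```

In an attribution setting, `f` would be a model's score as a function of perturbation masks over input regions, and the indices rank regions by their contribution to the output variance.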
1 code implementation • 6 May 2021 • Matthew Ricci, Minju Jung, Yuwei Zhang, Mathieu Chalvidal, Aneri Soni, Thomas Serre
Here, we present a single approach to both of these problems in the form of "KuraNet", a deep-learning-based system of coupled oscillators that can learn to synchronize across a distribution of disordered network conditions.
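KuraNet builds on the classic Kuramoto model of coupled oscillators. A minimal simulation of that base model (not KuraNet itself, which learns the couplings) shows the phenomenon being learned: above a critical coupling strength, heterogeneous oscillators phase-synchronize, as measured by the order parameter r. All names and constants below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def kuramoto(theta0, omega, K, steps=2000, dt=0.01):
    """Euler integration of the all-to-all Kuramoto model:
    dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)."""
    theta = theta0.copy()
    n = len(theta)
    for _ in range(steps):
        coupling = np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
        theta += dt * (omega + (K / n) * coupling)
    return theta

def order_parameter(theta):
    """r in [0, 1]; r = 1 means perfect phase synchrony."""
    return np.abs(np.exp(1j * theta).mean())

n = 50
theta0 = rng.uniform(0, 2 * np.pi, n)        # random initial phases
omega = rng.normal(0, 0.5, n)                # disordered natural frequencies

r_weak = order_parameter(kuramoto(theta0, omega, K=0.1))   # stays incoherent
r_strong = order_parameter(kuramoto(theta0, omega, K=5.0)) # synchronizes
```

KuraNet's contribution, per the abstract, is to learn such couplings with a deep network so that synchronization is achieved across a distribution of disordered network conditions rather than for one fixed configuration.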
no code implementations • ICLR 2021 • Mathieu Chalvidal, Matthew Ricci, Rufin VanRullen, Thomas Serre
Despite their elegant formulation and lightweight memory cost, neural ordinary differential equations (NODEs) suffer from known representational limitations.
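One well-known representational limitation of NODEs is that their learned maps are flows of an ODE, so trajectories cannot cross: in one dimension this forces the map to be order-preserving, ruling out functions as simple as x -> -x. The sketch below illustrates this with a generic vector field standing in for a learned network; it is a didactic example under assumed names, not the paper's model.

```python
import numpy as np

def flow(f, x0, T=1.0, steps=1000):
    """Explicit-Euler integration of dx/dt = f(x) from t=0 to t=T,
    i.e. the input-to-output map a 1-D NODE can realize."""
    x = x0.astype(float).copy()
    dt = T / steps
    for _ in range(steps):
        x += dt * f(x)
    return x

# Arbitrary smooth autonomous vector field (stand-in for a trained net).
f = lambda x: np.tanh(2.0 * x) - 0.5 * x

x0 = np.array([-1.0, 0.0, 1.0])
xT = flow(f, x0)
# Trajectories of a 1-D autonomous ODE cannot cross, so the ordering
# of the inputs is preserved in xT: no choice of f lets this flow
# represent x -> -x, which would require swapping -1 and 1.
```

Augmentation strategies (e.g. lifting the state into extra dimensions, or the higher-order formulations this line of work explores) are the standard ways around this constraint.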