Search Results for author: Paul Novello

Found 10 papers, 4 papers with code

Out-of-Distribution Detection Should Use Conformal Prediction (and Vice-versa?)

no code implementations • 18 Mar 2024 • Paul Novello, Joseba Dalmau, Léo Andeol

Based on the work of Bates et al. (2022), we define new conformal AUROC and conformal FPR@TPR95 metrics, which are corrections that provide probabilistic conservativeness guarantees on the variability of these metrics.

Anomaly Detection • Conformal Prediction • +2
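
As a point of reference for the metrics being corrected, here is a minimal sketch of the plain empirical AUROC and FPR@TPR95 computations for OOD detection scores; the conformal corrections introduced in the paper are not reproduced, and the score arrays below are purely illustrative.

```python
# Plain empirical AUROC and FPR@TPR95 for OOD scores (higher score = more OOD).
# The conformal corrections proposed in the paper are NOT reproduced here.
import numpy as np
from sklearn.metrics import roc_auc_score

def auroc(scores_in, scores_out):
    """AUROC with out-of-distribution samples as the positive class."""
    y_true = np.concatenate([np.zeros(len(scores_in)), np.ones(len(scores_out))])
    y_score = np.concatenate([scores_in, scores_out])
    return roc_auc_score(y_true, y_score)

def fpr_at_tpr95(scores_in, scores_out, tpr=0.95):
    """False positive rate at the threshold that detects `tpr` of OOD samples."""
    thr = np.quantile(scores_out, 1.0 - tpr)   # threshold reached by 95% of OOD scores
    return float(np.mean(scores_in >= thr))    # fraction of ID samples flagged as OOD

# Toy usage with synthetic scores (illustrative only).
rng = np.random.default_rng(0)
s_in, s_out = rng.normal(0, 1, 1000), rng.normal(2, 1, 1000)
print(auroc(s_in, s_out), fpr_at_tpr95(s_in, s_out))
```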

Unlocking Feature Visualization for Deeper Networks with MAgnitude Constrained Optimization

1 code implementation • 11 Jun 2023 • Thomas Fel, Thibaut Boissin, Victor Boutin, Agustin Picard, Paul Novello, Julien Colin, Drew Linsley, Tom Rousseau, Rémi Cadène, Laurent Gardes, Thomas Serre

However, its widespread adoption has been limited due to a reliance on tricks to generate interpretable images, and corresponding challenges in scaling it to deeper neural networks.

Robust One-Class Classification with Signed Distance Function using 1-Lipschitz Neural Networks

1 code implementation • 26 Jan 2023 • Louis Bethune, Paul Novello, Thibaut Boissin, Guillaume Coiffier, Mathieu Serrurier, Quentin Vincenot, Andres Troya-Galvis

The distance to the support can be interpreted as a normality score, and its approximation using 1-Lipschitz neural networks provides robustness bounds against $l_2$ adversarial attacks, an under-explored weakness of deep learning-based OCC algorithms.

Image Generation • One-Class Classification
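
A minimal sketch of the robustness argument, assuming a generic approximately 1-Lipschitz scorer built with spectral normalization (a stand-in for the dedicated 1-Lipschitz layers and training loss used in the paper): if the scorer is 1-Lipschitz, its output cannot change by more than the $l_2$ norm of the perturbation, which yields a certified radius.

```python
# Sketch: an (approximately) 1-Lipschitz scorer whose output change is bounded by the
# input perturbation's l2 norm, so a positive score certifies a robustness radius.
# Spectral normalization is a generic stand-in; the paper's dedicated 1-Lipschitz
# layers and training loss are not reproduced here.
import torch
import torch.nn as nn
from torch.nn.utils.parametrizations import spectral_norm

class LipschitzScorer(nn.Module):
    def __init__(self, dim, hidden=64):
        super().__init__()
        # Each spectrally normalized linear map has spectral norm ~1 and ReLU is
        # 1-Lipschitz, so the composition is approximately 1-Lipschitz.
        self.net = nn.Sequential(
            spectral_norm(nn.Linear(dim, hidden)), nn.ReLU(),
            spectral_norm(nn.Linear(hidden, hidden)), nn.ReLU(),
            spectral_norm(nn.Linear(hidden, 1)),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)  # normality score, akin to a signed distance

scorer = LipschitzScorer(dim=16)
x = torch.randn(4, 16)
score = scorer(x)
# For a truly 1-Lipschitz f, |f(x + delta) - f(x)| <= ||delta||_2, so a decision at
# threshold 0 cannot flip for any l2 perturbation smaller than the score itself.
certified_radius = torch.clamp(score, min=0.0)
print(score, certified_radius)
```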

Accelerating hypersonic reentry simulations using deep learning-based hybridization (with guarantees)

no code implementations • 27 Sep 2022 • Paul Novello, Gaël Poëtte, David Lugato, Simon Peluchon, Pietro Marco Congedo

To tackle this trade-off, we design a hybrid simulation code coupling a traditional fluid dynamics solver with a neural network approximating the chemical reactions.

Dimensionality Reduction
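
A schematic sketch of the coupling idea, with hypothetical `cfd_step`, `surrogate`, and `exact_chemistry` callables and a simple domain check standing in for the paper's guarantees:

```python
# Sketch of the hybridization idea: inside the time loop, a neural surrogate replaces
# the expensive chemical-reaction step, with a fallback to the exact solver when the
# state leaves the surrogate's validated domain. All callables and the bounds are
# placeholders, not the paper's actual code.
import numpy as np

def hybrid_simulation(state, surrogate, exact_chemistry, cfd_step, bounds, n_steps, dt):
    lo, hi = bounds
    for _ in range(n_steps):
        state = cfd_step(state, dt)            # traditional fluid-dynamics update
        if np.all(state >= lo) and np.all(state <= hi):
            state = surrogate(state)           # cheap NN approximation of the chemistry
        else:
            state = exact_chemistry(state, dt) # fall back to the reference solver
    return state
```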

Goal-Oriented Sensitivity Analysis of Hyperparameters in Deep Learning

1 code implementation • 13 Jul 2022 • Paul Novello, Gaël Poëtte, David Lugato, Pietro Marco Congedo

In this work, we study the use of goal-oriented sensitivity analysis, based on the Hilbert-Schmidt Independence Criterion (HSIC), for hyperparameter analysis and optimization.

BIG-bench Machine Learning • Hyperparameter Optimization
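
A minimal sketch of the underlying idea, using the standard biased HSIC estimator with Gaussian kernels to rank hyperparameters by their dependence with observed validation losses; the goal-oriented variants and the optimization procedure from the paper are not reproduced, and the data below are synthetic.

```python
# Minimal HSIC estimator (biased V-statistic, Gaussian kernels) used to rank
# hyperparameters by their dependence with a goal quantity such as validation loss.
import numpy as np

def gaussian_gram(x, bandwidth=None):
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    d2 = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    if bandwidth is None:                        # median heuristic
        bandwidth = np.sqrt(0.5 * np.median(d2[d2 > 0]))
    return np.exp(-d2 / (2 * bandwidth ** 2))

def hsic(x, y):
    n = len(x)
    K, L = gaussian_gram(x), gaussian_gram(y)
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Toy usage: rank two hyperparameters by dependence with the observed losses.
rng = np.random.default_rng(0)
lr = rng.uniform(1e-4, 1e-1, 200)
width = rng.integers(16, 256, 200)
loss = np.log(lr) ** 2 + 0.01 * rng.normal(size=200)   # loss driven mostly by lr
print({"lr": hsic(np.log(lr), loss), "width": hsic(width, loss)})
```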

Making Sense of Dependence: Efficient Black-box Explanations Using Dependence Measure

1 code implementation • 13 Jun 2022 • Paul Novello, Thomas Fel, David Vigouroux

HSIC measures the dependence between regions of an input image and the output of a model based on kernel embeddings of distributions.

Object Detection
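
A rough sketch of the black-box attribution recipe this suggests: sample random patch masks, query the model on masked images, and score each patch by the HSIC between its mask bit and the output. The kernels, sampler, and estimator are simplified stand-ins for the paper's choices; `model` is any black-box callable returning a scalar, the image is assumed channels-last, and its size is assumed divisible by the grid.

```python
# Simplified HSIC attribution sketch: Dirac kernel on binary patch masks, Gaussian
# kernel on model outputs. Not the paper's estimator or sampler.
import numpy as np

def centered_gram(K):
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def hsic(K, L):
    n = K.shape[0]
    return np.trace(centered_gram(K) @ centered_gram(L)) / (n - 1) ** 2

def hsic_attribution(model, image, grid=8, n_masks=300, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    h, w = image.shape[:2]                                            # image: (H, W, C)
    masks = rng.integers(0, 2, size=(n_masks, grid, grid))            # binary patch masks
    up = np.kron(masks, np.ones((h // grid, w // grid)))              # upsample to (H, W)
    outputs = np.array([model(image * m[..., None]) for m in up])     # black-box queries
    d2 = (outputs[:, None] - outputs[None, :]) ** 2
    L = np.exp(-d2 / (2 * (np.std(outputs) + 1e-8) ** 2))             # Gaussian kernel on outputs
    flat = masks.reshape(n_masks, -1)
    scores = [hsic((flat[:, j, None] == flat[:, j, None].T).astype(float), L)
              for j in range(grid * grid)]                            # Dirac kernel per patch bit
    return np.array(scores).reshape(grid, grid)
```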

Leveraging Local Variation in Data: Sampling and Weighting Schemes for Supervised Deep Learning

no code implementations • 19 Jan 2021 • Paul Novello, Gaël Poëtte, David Lugato, Pietro Congedo

In the context of supervised learning of a function by a neural network, we claim and empirically verify that the neural network yields better results when the distribution of the data set focuses on regions where the function to learn is steep.

Variance Based Sample Weighting for Supervised Learning

no code implementations • 1 Jan 2021 • Paul Novello, Gaël Poëtte, David Lugato, Pietro Congedo

In the context of supervised learning of a function by a Neural Network (NN), we claim and empirically justify that a NN yields better results when the distribution of the data set focuses on regions where the function to learn is steeper.
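
A minimal sketch of the general idea shared by these two papers, assuming a simple nearest-neighbour estimate of local variation and a weighted mean-squared error; the exact weighting scheme from the papers is not reproduced.

```python
# Sketch: weight training points by the local variance of the target among their
# nearest neighbours, so steep regions of the function contribute more to the loss.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def local_variance_weights(X, y, k=10):
    nn = NearestNeighbors(n_neighbors=k).fit(X)
    _, idx = nn.kneighbors(X)
    local_var = y[idx].var(axis=1)              # variance of the target in each neighbourhood
    return local_var / (local_var.mean() + 1e-12)  # normalize so weights average to ~1

# Toy usage: a weighted mean-squared error for a steep 1-D function.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (500, 1))
y = np.tanh(20 * X[:, 0])                       # steep around 0, flat elsewhere
w = local_variance_weights(X, y)
pred = np.zeros_like(y)                         # stand-in for a model's predictions
weighted_mse = np.mean(w * (pred - y) ** 2)
```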
