1 code implementation • 6 Oct 2023 • Maksym Andriushchenko, Francesco D'Angelo, Aditya Varre, Nicolas Flammarion
In this work, we highlight that the role of weight decay in modern deep learning is different from its regularization effect studied in classical learning theory.
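The weight-decay mechanism discussed here can be illustrated with a minimal sketch of a decoupled weight-decay update (the loss, step size, and decay coefficient below are illustrative assumptions, not the paper's setup):

```python
import numpy as np

def sgd_weight_decay_step(w, grad, lr=0.1, wd=0.01):
    """One SGD step with decoupled weight decay:
    w <- w - lr * grad - lr * wd * w
    """
    return w - lr * grad - lr * wd * w

# With a zero loss gradient, weight decay alone shrinks the weights
# geometrically: each step multiplies w by (1 - lr * wd) = 0.999.
w = np.ones(3)
for _ in range(100):
    w = sgd_weight_decay_step(w, grad=np.zeros(3))
```

The classical view is that this shrinkage acts as an explicit regularizer; the snippet above argues its role in modern deep learning is different.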
1 code implementation • 23 Nov 2021 • Maria R. Cervera, Rafael Dätwyler, Francesco D'Angelo, Hamza Keurti, Benjamin F. Grewe, Christian Henning
Although neural networks are powerful function approximators, the underlying modelling assumptions ultimately define the likelihood and thus the hypothesis class they are parameterizing.
1 code implementation • 12 Oct 2021 • Francesco D'Angelo, Christian Henning
In this paper, we question this assumption and show that proper Bayesian inference with function space priors induced by neural networks does not necessarily lead to good OOD detection.
1 code implementation • 26 Jul 2021 • Christian Henning, Francesco D'Angelo, Benjamin F. Grewe
The need to avoid confident predictions on unfamiliar data has sparked interest in out-of-distribution (OOD) detection.
1 code implementation • NeurIPS 2021 • Francesco D'Angelo, Vincent Fortuin
Deep ensembles have recently gained popularity in the deep learning community for their conceptual simplicity and efficiency.
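A deep ensemble in the sense described here — several networks trained from independent random initializations, with predictions averaged — can be sketched roughly as follows (the toy data, tiny MLP architecture, and all hyperparameters are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data.
X = np.linspace(-3, 3, 64)[:, None]
y = np.sin(X) + 0.1 * rng.normal(size=X.shape)

def init(seed, h=16):
    r = np.random.default_rng(seed)
    return {"W1": r.normal(0, 0.5, (1, h)), "b1": np.zeros(h),
            "W2": r.normal(0, 0.5, (h, 1)), "b2": np.zeros(1)}

def forward(p, X):
    a = np.tanh(X @ p["W1"] + p["b1"])
    return a @ p["W2"] + p["b2"], a

def train(p, X, y, lr=0.05, steps=3000):
    # Full-batch gradient descent on mean-squared error.
    for _ in range(steps):
        pred, a = forward(p, X)
        err = pred - y
        gW2 = a.T @ err / len(X)
        gb2 = err.mean(0)
        da = err @ p["W2"].T * (1 - a ** 2)   # backprop through tanh
        gW1 = X.T @ da / len(X)
        gb1 = da.mean(0)
        p["W1"] -= lr * gW1; p["b1"] -= lr * gb1
        p["W2"] -= lr * gW2; p["b2"] -= lr * gb2
    return p

# Ensemble = same architecture and data, different random initializations.
members = [train(init(seed), X, y) for seed in range(5)]
preds = np.stack([forward(p, X)[0] for p in members])
mean_pred = preds.mean(0)    # ensemble prediction
uncertainty = preds.std(0)   # member disagreement as an uncertainty proxy
```

The per-member disagreement is what makes such ensembles attractive for uncertainty estimation despite their conceptual simplicity.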
no code implementations • 20 Jun 2021 • Francesco D'Angelo, Vincent Fortuin, Florian Wenzel
Ensembles of deep neural networks have achieved great success recently, but they do not offer a proper Bayesian justification.
3 code implementations • NeurIPS 2021 • Christian Henning, Maria R. Cervera, Francesco D'Angelo, Johannes von Oswald, Regina Traber, Benjamin Ehret, Seijin Kobayashi, Benjamin F. Grewe, João Sacramento
We offer a practical deep learning implementation of our framework based on probabilistic task-conditioned hypernetworks, an approach we term posterior meta-replay.
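The core idea of a task-conditioned hypernetwork — a shared network that maps a per-task embedding to the weights of a target network — can be sketched as below; this is a deterministic toy illustration with made-up shapes and names, and it omits the probabilistic posterior machinery the paper builds on top:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target network: a single linear layer, IN -> OUT.
IN, OUT, EMB = 4, 2, 8
N_TARGET = IN * OUT + OUT  # number of weights + biases to generate

# Hypernetwork parameters (shared across tasks) and one embedding per task.
H = rng.normal(0, 0.1, (EMB, N_TARGET))
task_embeddings = {t: rng.normal(size=EMB) for t in range(3)}

def generate_target_params(task_id):
    """Map a task embedding to a flat weight vector, then reshape it."""
    flat = task_embeddings[task_id] @ H
    W = flat[: IN * OUT].reshape(IN, OUT)
    b = flat[IN * OUT:]
    return W, b

def target_forward(x, task_id):
    W, b = generate_target_params(task_id)
    return x @ W + b

x = rng.normal(size=(5, IN))
y0 = target_forward(x, 0)
y1 = target_forward(x, 1)
# Different task embeddings yield different generated weights,
# hence different functions for the same input.
```

Because all task solutions live in the shared hypernetwork parameters plus small embeddings, conditioning on a task identity retrieves that task's weights, which is what makes the approach natural for continual learning.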
no code implementations • AABI Symposium 2021 • Francesco D'Angelo, Vincent Fortuin

Particle-based optimization algorithms have recently been developed as sampling methods that iteratively update a set of particles to approximate a target distribution.
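A canonical instance of such particle-based sampling is Stein variational gradient descent (SVGD). A minimal 1-D sketch against a standard-normal target follows; the kernel bandwidth, step size, and particle count are assumptions, and the paper's analysis is not reproduced here:

```python
import numpy as np

def svgd_step(x, grad_logp, eps=0.1, h=1.0):
    """One SVGD update for 1-D particles x (shape (n,)):
    x_i += eps/n * sum_j [k(x_j, x_i) grad_logp(x_j) + d/dx_j k(x_j, x_i)]
    with an RBF kernel k(a, b) = exp(-(a - b)^2 / (2 h^2)).
    """
    diff = x[:, None] - x[None, :]    # diff[j, i] = x_j - x_i
    k = np.exp(-diff ** 2 / (2 * h ** 2))
    grad_k = -diff / h ** 2 * k       # d k(x_j, x_i) / d x_j
    # Driving term pulls particles toward high density;
    # kernel-gradient term repels them from each other.
    phi = (k * grad_logp(x)[:, None] + grad_k).mean(axis=0)
    return x + eps * phi

rng = np.random.default_rng(0)
x = rng.normal(5.0, 0.1, size=50)     # particles start far from the target
for _ in range(1000):
    x = svgd_step(x, lambda x: -x)    # target N(0, 1): grad log p(x) = -x
# Particles drift toward the target mode and spread out to cover it.
```

The attraction/repulsion decomposition in `phi` is exactly the "iteratively update a set of particles" mechanism the snippet refers to.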
1 code implementation • 15 Jan 2020 • Francesco D'Angelo, Lucas Böttcher
We also find that convolutional layers in VAEs are important for modelling spin correlations, whereas RBMs achieve similar or even better performance without convolutional filters.