no code implementations • 26 May 2025 • Baptiste Abélès, Eugenio Clerico, Hamish Flynn, Gergely Neu
assumption on the observation noise.
no code implementations • 23 Apr 2025 • Eugenio Clerico, Hamish Flynn, Wojciech Kotłowski, Gergely Neu
We develop a methodology for constructing confidence sets for parameters of statistical models via a reduction to sequential prediction.
no code implementations • 11 Oct 2024 • Baptiste Abélès, Eugenio Clerico, Gergely Neu
Traditional generalization results in statistical learning require a training data set made of independently drawn examples.
no code implementations • 18 Jun 2024 • Baptiste Abélès, Eugenio Clerico, Gergely Neu
We study the generalization error of statistical learning algorithms in a non-i.i.d. setting.
no code implementations • 20 Dec 2023 • Eugenio Clerico, Benjamin Guedj
We establish explicit dynamics for neural networks whose training objective has a regularising term that constrains the parameters to remain close to their initial value.
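A minimal numpy sketch of the kind of objective described above (a hypothetical quadratic loss standing in for a training loss, not the paper's setting): gradient descent on L(θ) + (λ/2)·‖θ − θ₀‖² stays closer to the initial value θ₀ as the regularisation strength λ grows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy objective: L(theta) = 0.5 * ||A theta - b||^2.
A = rng.normal(size=(20, 5))
b = rng.normal(size=20)
theta0 = rng.normal(size=5)  # initial parameter value

def loss_grad(theta):
    return A.T @ (A @ theta - b)

def train(lam, steps=5000, lr=1e-3):
    """Gradient descent on L(theta) + (lam/2) * ||theta - theta0||^2."""
    theta = theta0.copy()
    for _ in range(steps):
        theta -= lr * (loss_grad(theta) + lam * (theta - theta0))
    return theta

far = np.linalg.norm(train(lam=0.0) - theta0)
near = np.linalg.norm(train(lam=10.0) - theta0)
# Stronger proximity regularisation keeps the trained parameters nearer to theta0.
print(near, far)
```

For this quadratic loss the regularised minimiser is (AᵀA + λI)⁻¹(Aᵀb + λθ₀), whose distance to θ₀ shrinks monotonically in λ; the loop above just recovers that behaviour numerically.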
no code implementations • 6 Sep 2022 • Eugenio Clerico, Tyler Farghly, George Deligiannidis, Benjamin Guedj, Arnaud Doucet
We establish disintegrated PAC-Bayesian generalisation bounds for models trained with gradient descent methods or continuous gradient flows.
no code implementations • 2 Mar 2022 • Eugenio Clerico, Amitis Shidani, George Deligiannidis, Arnaud Doucet
This work discusses how to derive upper bounds for the expected generalisation error of supervised learning algorithms by means of the chaining technique.
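For context, the classical chaining bound that such techniques build on is Dudley's entropy-integral inequality; the form below is the standard textbook statement, not the paper's specific refinement. For a separable process $(X_t)_{t \in T}$ that is subgaussian with respect to a metric $d$,

```latex
\mathbb{E}\Big[\sup_{t \in T} X_t\Big]
\;\le\;
12 \int_0^{\infty} \sqrt{\log N(T, d, \varepsilon)}\, \mathrm{d}\varepsilon ,
```

where $N(T, d, \varepsilon)$ denotes the $\varepsilon$-covering number of $(T, d)$. Chaining-based generalisation bounds instantiate such inequalities with $X_t$ the centred generalisation error of hypothesis $t$.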
1 code implementation • 22 Oct 2021 • Eugenio Clerico, George Deligiannidis, Arnaud Doucet
Recent studies have empirically investigated different methods to train stochastic neural networks on a classification task by optimising a PAC-Bayesian bound via stochastic gradient descent.
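A hedged numpy sketch of the general recipe (not any specific paper's bound or architecture): a stochastic linear predictor with Gaussian weights $w \sim \mathcal{N}(\mu, \sigma^2 I)$ is trained by SGD on an empirical surrogate loss plus a KL penalty, using the reparameterisation trick. The data, surrogate loss, and fixed scale $\sigma$ are all assumptions of this toy example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy data: binary labels in {-1, +1} from a random linear rule.
n, d = 200, 5
X = rng.normal(size=(n, d))
y = np.sign(X @ rng.normal(size=d))

sigma = 0.1        # fixed posterior/prior scale (an assumption of this sketch)
mu = np.zeros(d)   # posterior mean, the trainable parameter

def objective(mu, w):
    emp = np.mean((X @ w - y) ** 2)         # surrogate loss of a sampled predictor
    kl = np.dot(mu, mu) / (2 * sigma ** 2)  # KL( N(mu, s^2 I) || N(0, s^2 I) )
    return emp + kl / n                     # PAC-Bayes-style penalised objective

lr = 0.05
start = objective(mu, mu + sigma * rng.normal(size=d))
for _ in range(500):
    w = mu + sigma * rng.normal(size=d)     # reparameterisation: w = mu + sigma * eps
    grad_emp = 2 * X.T @ (X @ w - y) / n    # gradient of the sampled surrogate loss
    grad_kl = mu / (sigma ** 2 * n)         # gradient of the KL/n penalty
    mu -= lr * (grad_emp + grad_kl)
end = objective(mu, mu + sigma * rng.normal(size=d))
print(start, end)
```

The papers in question optimise genuine PAC-Bayesian bounds (with more careful loss and KL weighting); this sketch only illustrates the shared structure of "empirical term plus KL penalty, differentiated through a sampled weight".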
1 code implementation • 17 Jun 2021 • Eugenio Clerico, George Deligiannidis, Arnaud Doucet
The limit of infinite width allows for substantial simplifications in the analytical study of over-parameterised neural networks.
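One simplification that is easy to see numerically: under the standard NNGP scaling (an assumption of this sketch, not any particular paper's setup), the output of a wide one-hidden-layer ReLU network at a fixed unit-norm input is approximately Gaussian at random initialisation, with variance E[a²]·E[relu(z)²] = 1/2.

```python
import numpy as np

rng = np.random.default_rng(0)

# Network f(x) = a . relu(W x) / sqrt(m), all entries i.i.d. N(0, 1).
# For a unit-norm input x, each pre-activation W_j . x is N(0, 1), so as the
# width m grows, f(x) tends to a centred Gaussian with variance 1/2.
m, samples = 1000, 4000
z = rng.normal(size=(samples, m))   # pre-activations W x at a unit-norm input
a = rng.normal(size=(samples, m))   # output-layer weights
f = (a * np.maximum(z, 0.0)).sum(axis=1) / np.sqrt(m)

# Empirical mean should be near 0 and empirical variance near 1/2.
print(f.mean(), f.var())
```

This Gaussian limiting behaviour is exactly what makes infinite-width analyses tractable: the random network is replaced by a Gaussian process whose covariance can be computed in closed form.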
no code implementations • 24 Oct 2020 • Soufiane Hayou, Eugenio Clerico, Bobby He, George Deligiannidis, Arnaud Doucet, Judith Rousseau
Deep ResNet architectures have achieved state-of-the-art performance on many tasks.