Search Results for author: Eugenio Clerico

Found 10 papers, 2 papers with code

Confidence Sequences for Generalized Linear Models via Regret Analysis

no code implementations • 23 Apr 2025 • Eugenio Clerico, Hamish Flynn, Wojciech Kotłowski, Gergely Neu

We develop a methodology for constructing confidence sets for parameters of statistical models via a reduction to sequential prediction.

Online-to-PAC generalization bounds under graph-mixing dependencies

no code implementations • 11 Oct 2024 • Baptiste Abélès, Eugenio Clerico, Gergely Neu

Traditional generalization results in statistical learning require the training data set to consist of independently drawn examples.


A note on regularised NTK dynamics with an application to PAC-Bayesian training

no code implementations • 20 Dec 2023 • Eugenio Clerico, Benjamin Guedj

We establish explicit training dynamics for neural networks whose objective includes a regularising term that constrains the parameters to remain close to their initial values.

Generalisation under gradient descent via deterministic PAC-Bayes

no code implementations • 6 Sep 2022 • Eugenio Clerico, Tyler Farghly, George Deligiannidis, Benjamin Guedj, Arnaud Doucet

We establish disintegrated PAC-Bayesian generalisation bounds for models trained with gradient descent methods or continuous gradient flows.

Chained Generalisation Bounds

no code implementations • 2 Mar 2022 • Eugenio Clerico, Amitis Shidani, George Deligiannidis, Arnaud Doucet

This work shows how to derive upper bounds on the expected generalisation error of supervised learning algorithms by means of the chaining technique.

Conditionally Gaussian PAC-Bayes

1 code implementation • 22 Oct 2021 • Eugenio Clerico, George Deligiannidis, Arnaud Doucet

Recent studies have empirically investigated methods for training stochastic neural networks on classification tasks by optimising a PAC-Bayesian bound via stochastic gradient descent.

Wide stochastic networks: Gaussian limit and PAC-Bayesian training

1 code implementation • 17 Jun 2021 • Eugenio Clerico, George Deligiannidis, Arnaud Doucet

The infinite-width limit allows for substantial simplifications in the analytical study of over-parameterised neural networks.

Stable ResNet

no code implementations • 24 Oct 2020 • Soufiane Hayou, Eugenio Clerico, Bobby He, George Deligiannidis, Arnaud Doucet, Judith Rousseau

Deep ResNet architectures have achieved state-of-the-art performance on many tasks.
