no code implementations • 30 Jun 2021 • Giulio Franzese, Dimitrios Milios, Maurizio Filippone, Pietro Michiardi
We revisit the theoretical properties of Hamiltonian stochastic differential equations (SDEs) for Bayesian posterior sampling, and we study the two types of errors that arise from numerical SDE simulation: the discretization error and the error due to noisy gradient estimates in the context of data subsampling.
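As a rough illustration of both error sources, here is a minimal sketch assuming a toy Gaussian posterior, an Euler-type discretization with friction (in the spirit of SGHMC), and minibatch gradient estimates; the step size `eta`, friction `alpha`, and batch size are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(1.0, 1.0, size=1000)  # toy dataset; infer the mean

def noisy_grad_log_post(theta, batch_size=32):
    """Stochastic gradient of the log-posterior from a data subsample."""
    batch = rng.choice(data, size=batch_size, replace=False)
    # N(0, 1) prior on theta, unit-variance Gaussian likelihood.
    return -theta + len(data) * np.mean(batch - theta)

def sghmc(n_steps=5000, eta=1e-4, alpha=0.1):
    """Euler-type discretization of a Hamiltonian SDE with friction."""
    theta, r, samples = 0.0, 0.0, []
    for _ in range(n_steps):
        r = (1 - alpha) * r + eta * noisy_grad_log_post(theta) \
            + np.sqrt(2 * alpha * eta) * rng.normal()
        theta += r
        samples.append(theta)
    return np.array(samples)

samples = sghmc()
# Exact posterior: N(sum(x) / (n + 1), 1 / (n + 1)); deviations in the sample
# spread reflect both discretization error and gradient-noise error.
print(samples[1000:].mean(), samples[1000:].std())
```

Shrinking `eta` reduces the discretization error, while increasing the batch size reduces the gradient-noise error, so comparing the empirical spread against the exact posterior standard deviation exposes both effects.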
no code implementations • NeurIPS 2021 • Ba-Hien Tran, Simone Rossi, Dimitrios Milios, Pietro Michiardi, Edwin V. Bonilla, Maurizio Filippone
We develop a novel method for carrying out model selection for Bayesian autoencoders (BAEs) by means of prior hyper-parameter optimization.
no code implementations • 25 Nov 2020 • Ba-Hien Tran, Simone Rossi, Dimitrios Milios, Maurizio Filippone
This poses a challenge because modern neural networks are characterized by a large number of parameters, and the choice of these priors has an uncontrolled effect on the induced functional prior, which is the distribution of the functions obtained by sampling the parameters from their prior distribution.
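The induced functional prior mentioned here can be inspected directly: sample weights from their prior, push inputs through the network, and look at the resulting functions. A minimal sketch, assuming a small tanh MLP with i.i.d. Gaussian priors (the architecture and `sigma_w` are illustrative):

```python
import numpy as np

def sample_function(x, widths=(1, 50, 50, 1), sigma_w=1.0, rng=None):
    """Draw one function from the prior: sample weights, evaluate the net."""
    rng = rng if rng is not None else np.random.default_rng()
    layers = list(zip(widths[:-1], widths[1:]))
    h = x[:, None]
    for i, (d_in, d_out) in enumerate(layers):
        W = rng.normal(0.0, sigma_w / np.sqrt(d_in), size=(d_in, d_out))
        b = rng.normal(0.0, sigma_w, size=d_out)
        h = h @ W + b
        if i < len(layers) - 1:
            h = np.tanh(h)
    return h.ravel()

x = np.linspace(-3, 3, 200)
rng = np.random.default_rng(0)
fns = np.stack([sample_function(x, rng=rng) for _ in range(20)])
print(fns.shape)  # (20, 200): an empirical view of the functional prior
```

Changing `sigma_w` or the depth visibly changes the sampled functions, which is exactly the uncontrolled effect the abstract points to.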
no code implementations • AABI Symposium 2021 • Dimitrios Milios, Pietro Michiardi, Maurizio Filippone
In this paper, we employ variational arguments to establish a connection between ensemble methods for Neural Networks and Bayesian inference.
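A minimal sketch of the ensemble side of this connection, assuming scikit-learn MLPs on a toy regression task; treating independently trained members as approximate posterior samples is the intuition the paper formalizes, not this exact recipe:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.normal(size=200)

# An ensemble of networks differing only in their random initialization.
ensemble = [
    MLPRegressor(hidden_layer_sizes=(50,), max_iter=2000, random_state=s).fit(X, y)
    for s in range(10)
]

X_test = np.linspace(-4, 4, 100)[:, None]
preds = np.stack([m.predict(X_test) for m in ensemble])
mean, std = preds.mean(axis=0), preds.std(axis=0)  # predictive mean / spread
```

The spread across members (`std`) then plays the role of predictive uncertainty, typically growing away from the training data.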
no code implementations • AABI Symposium 2021 • Ba-Hien Tran, Dimitrios Milios, Simone Rossi, Maurizio Filippone
The Bayesian treatment of neural networks requires placing a prior distribution over the weight and bias parameters of the network.
no code implementations • 10 Nov 2020 • Gia-Lac Tran, Dimitrios Milios, Pietro Michiardi, Maurizio Filippone
In this work, we address one limitation of sparse GPs, which is due to the challenge in dealing with a large number of inducing variables without imposing a special structure on the inducing inputs.
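For context, here is a minimal sketch of the standard inducing-variable (Nyström-style) approximation that sparse GPs build on, in plain NumPy; the paper concerns scaling the number of inducing variables, which this toy version does not address:

```python
import numpy as np

def rbf(A, B, lengthscale=1.0):
    """Squared-exponential kernel matrix between rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
Z = X[rng.choice(500, size=20, replace=False)]  # 20 inducing inputs

Kzz = rbf(Z, Z) + 1e-6 * np.eye(len(Z))
Kxz = rbf(X, Z)
K_approx = Kxz @ np.linalg.solve(Kzz, Kxz.T)    # rank-20 approximation of Kxx
print(np.linalg.norm(rbf(X, X) - K_approx))     # approximation error
```

The quality of the approximation hinges on how many inducing inputs are used and where they lie, which is what makes scaling their number without special structure difficult.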
no code implementations • 9 Jun 2020 • Giulio Franzese, Rosa Candela, Dimitrios Milios, Maurizio Filippone, Pietro Michiardi
In this work we define a unified mathematical framework to deepen our understanding of the role of stochastic gradient (SG) noise on the behavior of Markov chain Monte Carlo sampling (SGMCMC) algorithms.
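A minimal sketch of the stochastic gradient noise in question, assuming a simple quadratic loss in NumPy; it only measures how the minibatch gradient deviates from the full gradient, which is the quantity such a framework analyzes, not the framework itself:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=10_000)
theta = 0.0

def minibatch_grad(theta, batch_size):
    """Minibatch gradient of the average loss 0.5 * (theta - x)^2."""
    batch = rng.choice(data, size=batch_size, replace=False)
    return np.mean(theta - batch)

full_grad = np.mean(theta - data)
for bs in (10, 100, 1000):
    grads = np.array([minibatch_grad(theta, bs) for _ in range(2000)])
    # SG noise: deviation of the stochastic gradient from the full gradient;
    # its variance shrinks roughly like 1 / batch_size.
    print(bs, np.var(grads - full_grad))
```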
no code implementations • 8 Jun 2020 • Dimitrios Milios, Pietro Michiardi, Maurizio Filippone
In this paper, we employ variational arguments to establish a connection between ensemble methods for Neural Networks and Bayesian inference.
1 code implementation • NeurIPS 2018 • Dimitrios Milios, Raffaello Camoriano, Pietro Michiardi, Lorenzo Rosasco, Maurizio Filippone
In this paper, we study the problem of deriving fast and accurate classification algorithms with uncertainty quantification.
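A minimal sketch of the problem setting, using scikit-learn's off-the-shelf GP classifier on a toy dataset; this illustrates classification with uncertainty quantification but is not the paper's (faster) method:

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
gpc = GaussianProcessClassifier(kernel=1.0 * RBF(1.0)).fit(X, y)

probs = gpc.predict_proba(X[:5])
# Predictive entropy as a per-point uncertainty score.
entropy = -(probs * np.log(probs)).sum(axis=1)
print(probs, entropy)
```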
no code implementations • 3 Jun 2016 • Michalis Michaelides, Dimitrios Milios, Jane Hillston, Guido Sanguinetti
Dynamical systems with large state spaces are often expensive to explore thoroughly through experiments.