Search Results for author: Maurizio Filippone

Found 38 papers, 10 papers with code

How Much is Enough? A Study on Diffusion Times in Score-based Generative Models

no code implementations 10 Jun 2022 Giulio Franzese, Simone Rossi, Lixuan Yang, Alessandro Finamore, Dario Rossi, Maurizio Filippone, Pietro Michiardi

Score-based diffusion models are a class of generative models whose dynamics are described by stochastic differential equations that map noise into data.
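As a rough illustration of that noise-to-data mapping (not code from the paper), the forward variance-preserving SDE dx = -0.5 β(t) x dt + sqrt(β(t)) dW gradually turns data into Gaussian noise; the linear β schedule and step count below are assumptions for the sketch:

```python
import numpy as np

# Euler-Maruyama simulation of the forward VP-SDE (illustrative sketch):
#   dx = -0.5 * beta(t) * x dt + sqrt(beta(t)) dW
def forward_diffuse(x0, n_steps=1000, beta_min=0.1, beta_max=20.0, seed=0):
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    dt = 1.0 / n_steps
    for i in range(n_steps):
        t = i * dt
        beta = beta_min + t * (beta_max - beta_min)  # linear schedule (assumed)
        drift = -0.5 * beta * x
        x = x + drift * dt + np.sqrt(beta * dt) * rng.standard_normal(x.shape)
    return x

# After integrating to t = 1, the marginal is approximately standard Gaussian.
x_T = forward_diffuse(np.ones(5))
```

The reverse-time SDE, which requires the learned score, is what actually generates data; only the forward corruption process is sketched here.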

Local Random Feature Approximations of the Gaussian Kernel

1 code implementation 12 Apr 2022 Jonas Wacker, Maurizio Filippone

A fundamental drawback of kernel-based statistical models is their limited scalability to large data sets, which requires resorting to approximations.
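For context, a minimal sketch of the classical random Fourier feature baseline for the Gaussian kernel (the paper proposes improved local variants, which are not shown); all names and parameters here are illustrative:

```python
import numpy as np

# Random Fourier features for k(x, y) = exp(-||x - y||^2 / (2 * sigma^2)):
# sample frequencies from the kernel's spectral density and use cosine features.
def rff_features(X, n_features=2000, sigma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.standard_normal((d, n_features)) / sigma  # spectral samples
    b = rng.uniform(0, 2 * np.pi, n_features)         # random phases
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 3))
Z = rff_features(X)
K_approx = Z @ Z.T  # inner products of features approximate the kernel matrix
K_exact = np.exp(-((X[:, None] - X[None]) ** 2).sum(-1) / 2.0)
err = np.abs(K_approx - K_exact).max()
```

The point of such feature maps is that downstream models become linear in Z, so the cost scales with the number of features rather than the number of data points.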

Complex-to-Real Random Features for Polynomial Kernels

1 code implementation 4 Feb 2022 Jonas Wacker, Ruben Ohana, Maurizio Filippone

Kernel methods are ubiquitous in statistical modeling due to their theoretical guarantees as well as their competitive empirical performance.

Improved Random Features for Dot Product Kernels

no code implementations 21 Jan 2022 Jonas Wacker, Motonobu Kanagawa, Maurizio Filippone

These variance formulas elucidate conditions under which certain approximations (e.g., TensorSRHT) achieve lower variances than others (e.g., Rademacher sketches), and conditions under which the use of complex features leads to lower variances than real features.

Computer Vision · Natural Language Processing +1

Revisiting the Effects of Stochasticity for Hamiltonian Samplers

no code implementations 30 Jun 2021 Giulio Franzese, Dimitrios Milios, Maurizio Filippone, Pietro Michiardi

We revisit the theoretical properties of Hamiltonian stochastic differential equations (SDEs) for Bayesian posterior sampling, and we study the two types of errors that arise from numerical SDE simulation: the discretization error and the error due to noisy gradient estimates in the context of data subsampling.

Numerical Integration

Model Selection for Bayesian Autoencoders

no code implementations NeurIPS 2021 Ba-Hien Tran, Simone Rossi, Dimitrios Milios, Pietro Michiardi, Edwin V. Bonilla, Maurizio Filippone

We develop a novel method for carrying out model selection for Bayesian autoencoders (BAEs) by means of prior hyper-parameter optimization.

Model Selection · Representation Learning

All You Need is a Good Functional Prior for Bayesian Deep Learning

no code implementations 25 Nov 2020 Ba-Hien Tran, Simone Rossi, Dimitrios Milios, Maurizio Filippone

This poses a challenge because modern neural networks are characterized by a large number of parameters, and the choice of these priors has an uncontrolled effect on the induced functional prior, which is the distribution of the functions obtained by sampling the parameters from their prior distribution.

Gaussian Processes

Parametric Bootstrap Ensembles as Variational Inference

no code implementations AABI Symposium 2021 Dimitrios Milios, Pietro Michiardi, Maurizio Filippone

In this paper, we employ variational arguments to establish a connection between ensemble methods for Neural Networks and Bayesian inference.

Bayesian Inference · Variational Inference

Sparse within Sparse Gaussian Processes using Neighbor Information

no code implementations 10 Nov 2020 Gia-Lac Tran, Dimitrios Milios, Pietro Michiardi, Maurizio Filippone

In this work, we address one limitation of sparse GPs, which is due to the challenge in dealing with a large number of inducing variables without imposing a special structure on the inducing inputs.

Gaussian Processes · Variational Inference

An Identifiable Double VAE For Disentangled Representations

no code implementations 19 Oct 2020 Graziano Mita, Maurizio Filippone, Pietro Michiardi

A large part of the literature on learning disentangled representations focuses on variational autoencoders (VAE).


Isotropic SGD: a Practical Approach to Bayesian Posterior Sampling

no code implementations 9 Jun 2020 Giulio Franzese, Rosa Candela, Dimitrios Milios, Maurizio Filippone, Pietro Michiardi

In this work we define a unified mathematical framework to deepen our understanding of the role of stochastic gradient (SG) noise on the behavior of Markov chain Monte Carlo sampling (SGMCMC) algorithms.

A Variational View on Bootstrap Ensembles as Bayesian Inference

no code implementations 8 Jun 2020 Dimitrios Milios, Pietro Michiardi, Maurizio Filippone

In this paper, we employ variational arguments to establish a connection between ensemble methods for Neural Networks and Bayesian inference.

Bayesian Inference

Model Monitoring and Dynamic Model Selection in Travel Time-series Forecasting

2 code implementations 16 Mar 2020 Rosa Candela, Pietro Michiardi, Maurizio Filippone, Maria A. Zuluaga

Accurate travel products price forecasting is a highly desired feature that allows customers to make informed decisions about purchases, and companies to build and offer attractive tour packages.


Sparse Gaussian Processes Revisited: Bayesian Approaches to Inducing-Variable Approximations

no code implementations 6 Mar 2020 Simone Rossi, Markus Heinonen, Edwin V. Bonilla, Zheyang Shen, Maurizio Filippone

Variational inference techniques based on inducing variables provide an elegant framework for scalable posterior estimation in Gaussian process (GP) models.

Gaussian Processes · Variational Inference

Efficient Approximate Inference with Walsh-Hadamard Variational Inference

no code implementations 29 Nov 2019 Simone Rossi, Sebastien Marmin, Maurizio Filippone

Variational inference offers scalable and flexible tools to tackle intractable Bayesian inference of modern statistical models like Bayesian neural networks and Gaussian processes.

Bayesian Inference · Gaussian Processes +1

LIBRE: Learning Interpretable Boolean Rule Ensembles

no code implementations 15 Nov 2019 Graziano Mita, Paolo Papotti, Maurizio Filippone, Pietro Michiardi

We present a novel method - LIBRE - to learn an interpretable classifier, which materializes as a set of Boolean rules.

Sparsification as a Remedy for Staleness in Distributed Asynchronous SGD

no code implementations 21 Oct 2019 Rosa Candela, Giulio Franzese, Maurizio Filippone, Pietro Michiardi

Large scale machine learning is increasingly relying on distributed optimization, whereby several machines contribute to the training process of a statistical model.

Distributed Optimization

Deep Compositional Spatial Models

no code implementations 6 Jun 2019 Andrew Zammit-Mangion, Tin Lok James Ng, Quan Vu, Maurizio Filippone

Spatial processes with nonstationary and anisotropic covariance structure are often used when modelling, analysing and predicting complex environmental phenomena.

Gaussian Processes

Walsh-Hadamard Variational Inference for Bayesian Deep Learning

no code implementations NeurIPS 2020 Simone Rossi, Sebastien Marmin, Maurizio Filippone

Over-parameterized models, such as DeepNets and ConvNets, form a class of models that are routinely adopted in a wide variety of applications, and for which Bayesian inference is desirable but extremely challenging.

Bayesian Inference · Variational Inference

A comparative evaluation of novelty detection algorithms for discrete sequences

no code implementations 28 Feb 2019 Rémi Domingues, Pietro Michiardi, Jérémie Barlet, Maurizio Filippone

The identification of anomalies in temporal data is a core component of numerous research areas such as intrusion detection, fault prevention, genomics and fraud detection.

Fraud Detection · Intrusion Detection

Variational Calibration of Computer Models

no code implementations 29 Oct 2018 Sébastien Marmin, Maurizio Filippone

Bayesian calibration of black-box computer models offers an established framework to obtain a posterior distribution over model parameters.

Gaussian Processes · Variational Inference

Good Initializations of Variational Bayes for Deep Models

no code implementations 18 Oct 2018 Simone Rossi, Pietro Michiardi, Maurizio Filippone

Stochastic variational inference is an established way to carry out approximate Bayesian inference for deep models.

Bayesian Inference · General Classification +1

Calibrating Deep Convolutional Gaussian Processes

1 code implementation 26 May 2018 Gia-Lac Tran, Edwin V. Bonilla, John P. Cunningham, Pietro Michiardi, Maurizio Filippone

The wide adoption of Convolutional Neural Networks (CNNs) in applications where decision-making under uncertainty is fundamental, has brought a great deal of attention to the ability of these models to accurately quantify the uncertainty in their predictions.

Decision Making · Decision Making Under Uncertainty +3

Constraining the Dynamics of Deep Probabilistic Models

no code implementations ICML 2018 Marco Lorenzi, Maurizio Filippone

We introduce a novel generative formulation of deep probabilistic models implementing "soft" constraints on their function dynamics.

Variational Inference

Pseudo-extended Markov chain Monte Carlo

1 code implementation NeurIPS 2019 Christopher Nemeth, Fredrik Lindsten, Maurizio Filippone, James Hensman

In this paper, we introduce the pseudo-extended MCMC method as a simple approach for improving the mixing of the MCMC sampler for multi-modal posterior distributions.

Entropic Trace Estimates for Log Determinants

1 code implementation 24 Apr 2017 Jack Fitzsimons, Diego Granziol, Kurt Cutajar, Michael Osborne, Maurizio Filippone, Stephen Roberts

The scalable calculation of matrix determinants has been a bottleneck to the widespread application of many machine learning methods such as determinantal point processes, Gaussian processes, generalised Markov random fields, graph models and many others.

Gaussian Processes · Point Processes
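The common thread behind such scalable log-determinant estimators is the identity log det K = tr(log K) combined with stochastic trace estimation. A hedged sketch using Hutchinson's estimator with Rademacher probes follows; forming log K explicitly (via `scipy.linalg.logm`) defeats the scalability these papers target and is done here only for clarity:

```python
import numpy as np
from scipy.linalg import logm

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 40))
K = A @ A.T + 40 * np.eye(40)   # well-conditioned SPD kernel-like matrix
L = logm(K).real                # explicit log(K): for illustration only

# Hutchinson's estimator: tr(log K) is the mean of z^T log(K) z over
# random Rademacher probe vectors z.
n_probes = 500
z = rng.choice([-1.0, 1.0], size=(40, n_probes))
est = np.einsum('ij,ij->j', z, L @ z).mean()

exact = np.linalg.slogdet(K)[1]  # reference value of log det K
```

Scalable methods replace the explicit matrix logarithm with quadrature or polynomial approximations applied through matrix-vector products alone.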

Bayesian Inference of Log Determinants

no code implementations 5 Apr 2017 Jack Fitzsimons, Kurt Cutajar, Michael Osborne, Stephen Roberts, Maurizio Filippone

The log-determinant of a kernel matrix appears in a variety of machine learning problems, ranging from determinantal point processes and generalized Markov random fields, through to the training of Gaussian processes.

Bayesian Inference · Gaussian Processes +1

AutoGP: Exploring the Capabilities and Limitations of Gaussian Process Models

no code implementations 18 Oct 2016 Karl Krauth, Edwin V. Bonilla, Kurt Cutajar, Maurizio Filippone

We investigate the capabilities and limitations of Gaussian process models by jointly exploring three complementary directions: (i) scalable and statistically efficient inference; (ii) flexible kernels; and (iii) objective functions for hyperparameter learning alternative to the marginal likelihood.

General Classification

Random Feature Expansions for Deep Gaussian Processes

1 code implementation ICML 2017 Kurt Cutajar, Edwin V. Bonilla, Pietro Michiardi, Maurizio Filippone

The composition of multiple Gaussian Processes as a Deep Gaussian Process (DGP) enables a deep probabilistic nonparametric approach to flexibly tackle complex machine learning problems with sound quantification of uncertainty.

Gaussian Processes · Variational Inference

Mini-Batch Spectral Clustering

no code implementations 7 Jul 2016 Yufei Han, Maurizio Filippone

The cost of computing the spectrum of Laplacian matrices hinders the application of spectral clustering to large data sets.
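To make the bottleneck concrete, here is a baseline spectral clustering sketch on a small two-cluster problem (not the paper's mini-batch method); the eigendecomposition of the normalized Laplacian below is the step whose cost the paper targets:

```python
import numpy as np

# Two well-separated Gaussian blobs (illustrative data).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.2, (30, 2)), rng.normal(2, 0.2, (30, 2))])

# Gaussian similarity graph and symmetric normalized Laplacian
# L_sym = I - D^{-1/2} S D^{-1/2}.
S = np.exp(-((X[:, None] - X[None]) ** 2).sum(-1))
d = S.sum(1)
L_sym = np.eye(60) - S / np.sqrt(d[:, None] * d[None])

# Full eigendecomposition: the expensive step at scale. The sign of the
# second-smallest (Fiedler) eigenvector separates the two clusters here.
vals, vecs = np.linalg.eigh(L_sym)
labels = (vecs[:, 1] > 0).astype(int)
```

In practice one runs k-means on the leading eigenvector embedding; the sign test suffices for this two-cluster toy case.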

Preconditioning Kernel Matrices

1 code implementation 22 Feb 2016 Kurt Cutajar, Michael A. Osborne, John P. Cunningham, Maurizio Filippone

Solving linear systems involving large kernel matrices is a computational bottleneck for kernel machines; preconditioning is a common approach to alleviating this issue.
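A minimal sketch of the idea: solve (K + σ²I) x = y with conjugate gradients, supplying a simple diagonal (Jacobi) preconditioner as a stand-in for the more sophisticated preconditioners studied in the paper; all data and parameters are illustrative:

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
K = np.exp(-((X[:, None] - X[None]) ** 2).sum(-1) / 2.0)  # RBF kernel matrix
A = K + 0.1 * np.eye(200)                                  # add noise/jitter term
y = rng.standard_normal(200)

# Jacobi preconditioner: apply M^{-1} v = v / diag(A) via a LinearOperator.
d = np.diag(A)
M = LinearOperator(A.shape, matvec=lambda v: v / d)

x, info = cg(A, y, M=M)  # info == 0 signals convergence
```

CG needs only matrix-vector products with A, which is what makes stochastic-gradient and iterative approaches to Gaussian process inference feasible at scale; a good preconditioner cuts the iteration count.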

MCMC for Variationally Sparse Gaussian Processes

no code implementations NeurIPS 2015 James Hensman, Alexander G. de G. Matthews, Maurizio Filippone, Zoubin Ghahramani

This paper simultaneously addresses these, using a variational approximation to the posterior which is sparse in support of the function but otherwise free-form.

Gaussian Processes

Enabling scalable stochastic gradient-based inference for Gaussian processes by employing the Unbiased LInear System SolvEr (ULISSE)

no code implementations 22 Jan 2015 Maurizio Filippone, Raphael Engler

In applications of Gaussian processes where quantification of uncertainty is of primary interest, it is necessary to accurately characterize the posterior distribution over covariance parameters.

Gaussian Processes

Bayesian Inference for Gaussian Process Classifiers with Annealing and Pseudo-Marginal MCMC

no code implementations 28 Nov 2013 Maurizio Filippone

The results empirically demonstrate that compared to importance sampling, annealed importance sampling can reduce the variance of the estimate of the marginal likelihood exponentially in the number of data at a computational cost that scales only polynomially.

Bayesian Inference

Pseudo-Marginal Bayesian Inference for Gaussian Processes

no code implementations 2 Oct 2013 Maurizio Filippone, Mark Girolami

The main challenges that arise when adopting Gaussian Process priors in probabilistic modeling are how to carry out exact Bayesian inference and how to account for uncertainty on model parameters when making model-based predictions on out-of-sample data.

Bayesian Inference · Gaussian Processes
