Search Results for author: Luigi Acerbi

Found 14 papers, 12 papers with code

PyBADS: Fast and robust black-box optimization in Python

1 code implementation · 27 Jun 2023 · Gurjeet Sangra Singh, Luigi Acerbi

PyBADS is a Python implementation of the Bayesian Adaptive Direct Search (BADS) algorithm for fast and robust black-box optimization (Acerbi and Ma, 2017).

Input-gradient space particle inference for neural network ensembles

1 code implementation · 5 Jun 2023 · Trung Trinh, Markus Heinonen, Luigi Acerbi, Samuel Kaski

To sidestep these difficulties, we propose First-order Repulsive Deep Ensemble (FoRDE), an ensemble learning method based on ParVI, which performs repulsion in the space of first-order input gradients.

Ensemble Learning · Image Classification · +2

Learning Robust Statistics for Simulation-based Inference under Model Misspecification

1 code implementation · NeurIPS 2023 · Daolang Huang, Ayush Bharti, Amauri Souza, Luigi Acerbi, Samuel Kaski

Simulation-based inference (SBI) methods such as approximate Bayesian computation (ABC), synthetic likelihood, and neural posterior estimation (NPE) rely on simulating statistics to infer parameters of intractable likelihood models.

Time Series
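The abstract above names approximate Bayesian computation (ABC) as one of the simulation-based inference methods that rely on simulated statistics. As a minimal illustration of that idea (a generic ABC rejection sampler, not the paper's method; the toy simulator, flat prior, and tolerance are all assumptions for the example):

```python
import random
import statistics

def simulate(theta, n, rng):
    """Toy simulator: n draws from a Gaussian with mean theta, sd 1."""
    return [rng.gauss(theta, 1.0) for _ in range(n)]

def abc_rejection(observed, n_sims, epsilon, rng):
    """Basic ABC rejection: keep prior draws whose simulated summary
    statistic (here the sample mean) lands within epsilon of the
    observed statistic; the kept draws approximate the posterior."""
    s_obs = statistics.fmean(observed)
    accepted = []
    for _ in range(n_sims):
        theta = rng.uniform(-5.0, 5.0)   # flat prior over the mean
        s_sim = statistics.fmean(simulate(theta, len(observed), rng))
        if abs(s_sim - s_obs) < epsilon:
            accepted.append(theta)
    return accepted

rng = random.Random(0)
data = simulate(2.0, 200, rng)           # "observed" data, true mean = 2
posterior = abc_rejection(data, 20000, 0.1, rng)
print(statistics.fmean(posterior))       # clusters near the true mean
```

Because inference sees the data only through the summary statistic, a misspecified or non-robust statistic corrupts the accepted draws, which is the failure mode the paper targets.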

PyVBMC: Efficient Bayesian inference in Python

1 code implementation · 16 Mar 2023 · Bobby Huggins, Chengkun Li, Marlon Tobaben, Mikko J. Aarnos, Luigi Acerbi

PyVBMC is a Python implementation of the Variational Bayesian Monte Carlo (VBMC) algorithm for posterior and model inference for black-box computational models (Acerbi, 2018, 2020).

Bayesian Inference · Model Selection

Fast post-process Bayesian inference with Sparse Variational Bayesian Monte Carlo

no code implementations · 9 Mar 2023 · Chengkun Li, Grégoire Clarté, Luigi Acerbi

First, we make VBMC scalable to a large number of pre-existing evaluations via sparse GP regression, deriving novel Bayesian quadrature formulae and acquisition functions for active learning with sparse GPs.

Active Learning · Bayesian Inference

Online simulator-based experimental design for cognitive model selection

1 code implementation · 3 Mar 2023 · Alexander Aushev, Aini Putkonen, Grégoire Clarté, Suyog Chandramouli, Luigi Acerbi, Samuel Kaski, Andrew Howes

In this paper, we propose BOSMOS: an approach to experimental design that can select between computational models without tractable likelihoods.

Experimental Design · Model Selection

Tackling covariate shift with node-based Bayesian neural networks

1 code implementation · 6 Jun 2022 · Trung Trinh, Markus Heinonen, Luigi Acerbi, Samuel Kaski

In this paper, we interpret these latent noise variables as implicit representations of simple and domain-agnostic data perturbations during training, producing BNNs that perform well under covariate shift due to input corruptions.

Image Classification

Parallel MCMC Without Embarrassing Failures

1 code implementation · 22 Feb 2022 · Daniel Augusto de Souza, Diego Mesquita, Samuel Kaski, Luigi Acerbi

While efficient, the embarrassingly parallel MCMC framework is very sensitive to the quality of subposterior sampling.

Active Learning · Bayesian Inference
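For context on the framework this paper builds on: embarrassingly parallel MCMC splits the data across workers, samples each subposterior independently, and then combines them, since the full posterior is proportional to the product of the subposteriors. The combination step has a closed form in the Gaussian case, sketched below (this illustrates the baseline framework only, not the paper's proposed fix for its failure modes; the function name is ours):

```python
import math

def combine_gaussian_subposteriors(mus, sigmas):
    """Precision-weighted product of Gaussian subposterior densities:
    N(m1, s1^2) * ... * N(mk, sk^2) is proportional to N(m, s^2) with
    1/s^2 = sum_i 1/si^2 and m = s^2 * sum_i (mi / si^2)."""
    precisions = [1.0 / s**2 for s in sigmas]
    total_precision = sum(precisions)
    mean = sum(p * m for p, m in zip(precisions, mus)) / total_precision
    return mean, math.sqrt(1.0 / total_precision)

# Two workers, each holding half the data, report Gaussian subposteriors.
mean, sd = combine_gaussian_subposteriors([1.0, 3.0], [1.0, 1.0])
print(mean, sd)  # → 2.0 0.7071067811865476
```

In practice the subposteriors are represented by samples rather than exact Gaussians, and a combination rule fitted to poor samples can fail badly, which is the sensitivity the abstract refers to.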

Dynamic allocation of limited memory resources in reinforcement learning

1 code implementation · NeurIPS 2020 · Nisheet Patel, Luigi Acerbi, Alexandre Pouget

We derive from first principles an algorithm, Dynamic Resource Allocator (DRA), which we apply to two standard tasks in reinforcement learning and a model-based planning task, and find that it allocates more resources to items in memory that have a higher impact on cumulative rewards.

Neurons and Cognition

Variational Bayesian Monte Carlo with Noisy Likelihoods

2 code implementations · NeurIPS 2020 · Luigi Acerbi

Variational Bayesian Monte Carlo (VBMC) is a recently introduced framework that uses Gaussian process surrogates to perform approximate Bayesian inference in models with black-box, expensive-to-evaluate likelihoods.

Bayesian Inference

Unbiased and Efficient Log-Likelihood Estimation with Inverse Binomial Sampling

2 code implementations · 12 Jan 2020 · Bas van Opheusden, Luigi Acerbi, Wei Ji Ma

We provide theoretical arguments in favor of IBS and an empirical assessment of the method for maximum-likelihood estimation with simulation-based models.
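The core of inverse binomial sampling is simple enough to sketch in a few lines: draw from the simulator until a sample matches the observed response, and if the match occurs on draw K, report -(1 + 1/2 + ... + 1/(K-1)) as the log-likelihood estimate; its expectation equals log p exactly. A minimal sketch of that estimator on a toy Bernoulli simulator (the function names and the coin model are ours, for illustration):

```python
import math
import random

def ibs_log_prob(simulate, observed, rng, max_draws=10**6):
    """One IBS estimate of log p(observed): sample from the simulator
    until a draw matches the observed response. If the first match is
    on draw K, return -sum_{k=1}^{K-1} 1/k, an unbiased estimate of
    the log-likelihood log p(observed)."""
    k = 1
    while simulate(rng) != observed:
        k += 1
        if k > max_draws:
            raise RuntimeError("no match; response probability may be ~0")
    return -sum(1.0 / j for j in range(1, k))

# Toy simulator with known ground truth: p("heads") = 0.3.
def coin(rng):
    return "heads" if rng.random() < 0.3 else "tails"

rng = random.Random(1)
estimates = [ibs_log_prob(coin, "heads", rng) for _ in range(20000)]
avg = sum(estimates) / len(estimates)
print(avg, math.log(0.3))  # the average is close to the true log p
```

Averaging many such repeats trades simulation cost for lower variance while keeping the estimator unbiased, which is what makes IBS usable inside maximum-likelihood estimation for simulation-based models.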

Variational Bayesian Monte Carlo

4 code implementations · NeurIPS 2018 · Luigi Acerbi

We introduce here a novel sample-efficient inference framework, Variational Bayesian Monte Carlo (VBMC).

Bayesian Inference · Model Selection · +1

Practical Bayesian Optimization for Model Fitting with Bayesian Adaptive Direct Search

4 code implementations · NeurIPS 2017 · Luigi Acerbi, Wei Ji Ma

Computational models in fields such as computational neuroscience are often evaluated via stochastic simulation or numerical approximation.

Bayesian Optimization

A Framework for Testing Identifiability of Bayesian Models of Perception

no code implementations · NeurIPS 2014 · Luigi Acerbi, Wei Ji Ma, Sethu Vijayakumar

Bayesian observer models are very effective in describing human performance in perceptual tasks, so much so that they are trusted to faithfully recover hidden mental representations of priors, likelihoods, or loss functions from the data.

Experimental Design
