Search Results for author: Víctor Elvira

Found 13 papers, 3 papers with code

Efficient Mixture Learning in Black-Box Variational Inference

no code implementations · 11 Jun 2024 · Alexandra Hotti, Oskar Kviman, Ricky Molén, Víctor Elvira, Jens Lagergren

Scaling the number of mixture components, however, currently leads to a linear increase in the number of learnable parameters and a quadratic increase in inference time, due to the evaluation of the evidence lower bound (ELBO).

Adaptive importance sampling for heavy-tailed distributions via $α$-divergence minimization

1 code implementation · 25 Oct 2023 · Thomas Guilmeau, Nicola Branchini, Emilie Chouzenoux, Víctor Elvira

We then show that the $\alpha$-divergence can be approximated by a generalized notion of effective sample size and leverage this new perspective to adapt the tail parameter with Bayesian optimization.

Bayesian Optimization · Variational Inference
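As a rough illustration of the idea the paper generalizes, here is the classical effective sample size (ESS) diagnostic used in importance sampling. This is a minimal sketch of the standard quantity only; the α-divergence-based generalization proposed in the paper is not reproduced here.

```python
import numpy as np

def effective_sample_size(log_weights):
    """Classical ESS: 1 / sum of squared normalized importance weights."""
    w = np.exp(log_weights - np.max(log_weights))  # subtract max for stability
    w /= w.sum()                                   # normalize the weights
    return 1.0 / np.sum(w ** 2)

# Uniform weights give the maximum ESS (equal to the sample count),
# while a single dominant weight drives the ESS toward 1.
print(effective_sample_size(np.zeros(100)))                    # 100.0
print(effective_sample_size(np.array([0.0, -50.0, -50.0])))    # ~1.0
```

A low ESS signals weight degeneracy, which is exactly the failure mode that motivates adapting the proposal's tail parameter.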

Differentiable Bootstrap Particle Filters for Regime-Switching Models

no code implementations · 20 Feb 2023 · Wenhan Li, Xiongjie Chen, Wenwu Wang, Víctor Elvira, Yunpeng Li

Differentiable particle filters are an emerging class of particle filtering methods that use neural networks to construct and learn parametric state-space models.

Cooperation in the Latent Space: The Benefits of Adding Mixture Components in Variational Autoencoders

1 code implementation · 30 Sep 2022 · Oskar Kviman, Ricky Molén, Alexandra Hotti, Semih Kurt, Víctor Elvira, Jens Lagergren

In this work, we also demonstrate that increasing the number of mixture components improves the latent-representation capabilities of the VAE on both image and single-cell datasets.

Variational Inference

Hamiltonian Adaptive Importance Sampling

no code implementations · 27 Sep 2022 · Ali Mousavi, Reza Monsefi, Víctor Elvira

Importance sampling (IS) is a powerful Monte Carlo (MC) methodology for approximating integrals, for instance in the context of Bayesian inference.

Bayesian Inference
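The basic IS recipe summarized above fits in a few lines: sample from a proposal, reweight by the target-to-proposal density ratio, and form a weighted average. The target, proposal, and sample size below are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Estimate E_p[x^2] for target p = N(0, 1), using samples from a
# heavier-tailed proposal q = N(0, 2^2) (an illustrative choice).
def log_p(x):                      # standard normal, up to a constant
    return -0.5 * x**2

def log_q(x):                      # N(0, 4), same constant dropped
    return -0.5 * x**2 / 4.0 - np.log(2.0)

x = rng.normal(0.0, 2.0, size=200_000)
log_w = log_p(x) - log_q(x)        # importance log-weights
w = np.exp(log_w - log_w.max())    # stabilized, unnormalized weights
estimate = np.sum(w * x**2) / np.sum(w)  # self-normalized IS estimator
print(estimate)                    # close to the true value 1.0
```

The self-normalized form used here only needs the densities up to a constant, which is why IS is so convenient for Bayesian posteriors with intractable normalizers.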

Multiple Importance Sampling ELBO and Deep Ensembles of Variational Approximations

1 code implementation · 22 Feb 2022 · Oskar Kviman, Harald Melin, Hazal Koptagel, Víctor Elvira, Jens Lagergren

In variational inference (VI), the marginal log-likelihood is estimated using the standard evidence lower bound (ELBO) or improved versions such as the importance weighted ELBO (IWELBO).

Density Estimation · Variational Inference
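The ELBO/IWELBO relationship mentioned in the snippet can be checked numerically on a toy conjugate model. Everything below (the model, the deliberately mismatched variational distribution, the sample sizes) is an illustrative sketch, not the paper's multiple-importance-sampling ELBO.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: p(z) = N(0, 1), p(x|z) = N(z, 1), observed x = 2.0.
# Variational q(z) = N(0, 1), deliberately mismatched for illustration.
x_obs = 2.0

def log_joint(z):
    return -0.5 * z**2 - 0.5 * (x_obs - z) ** 2 - np.log(2 * np.pi)

def log_q(z):
    return -0.5 * z**2 - 0.5 * np.log(2 * np.pi)

def iwelbo(K, n_outer=50_000):
    z = rng.normal(size=(n_outer, K))     # K inner samples from q
    log_w = log_joint(z) - log_q(z)       # importance log-weights
    m = log_w.max(axis=1, keepdims=True)  # stable log-mean-exp over K
    return np.mean(m.squeeze() + np.log(np.mean(np.exp(log_w - m), axis=1)))

elbo = iwelbo(1)   # K = 1 recovers the standard ELBO
iw = iwelbo(16)    # larger K gives a tighter lower bound
print(elbo, iw)
```

For this model the ELBO is about -3.42, and the K = 16 IWELBO sits noticeably closer to the true log marginal likelihood (about -2.27), illustrating why importance-weighted bounds are described as "improved versions" of the ELBO.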

Compressed Monte Carlo with application in particle filtering

no code implementations · 18 Jul 2021 · Luca Martino, Víctor Elvira

In its basic version, C-MC is closely related to stratification, a well-known variance-reduction technique.

Bayesian Inference
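Stratification, the variance-reduction technique the snippet refers to, can be demonstrated on a one-dimensional integral: splitting the domain into equal strata and sampling each one separately removes the between-strata component of the variance. The integrand and sample sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
N, S = 10_000, 10                # total samples, number of strata

f = lambda u: u**2               # integrand on [0, 1]; true integral = 1/3

# Plain Monte Carlo over the whole interval.
plain = f(rng.uniform(size=N)).mean()

# Stratified: equal budget in each of S equal-width strata,
# then average the per-stratum means.
edges = np.linspace(0.0, 1.0, S + 1)
strat = np.mean([
    f(rng.uniform(lo, hi, size=N // S)).mean()
    for lo, hi in zip(edges[:-1], edges[1:])
])
print(plain, strat)              # both near 1/3; stratified varies less
```

Repeating both estimators many times would show the stratified one clustering much more tightly around 1/3, which is the effect compressed Monte Carlo exploits.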

Optimized Auxiliary Particle Filters: adapting mixture proposals via convex optimization

no code implementations · 18 Nov 2020 · Nicola Branchini, Víctor Elvira

In this work, we propose optimized auxiliary particle filters, a framework where the traditional APF auxiliary variables are interpreted as weights in an importance sampling mixture proposal.


A probabilistic incremental proximal gradient method

no code implementations · 4 Dec 2018 · Ömer Deniz Akyildiz, Émilie Chouzenoux, Víctor Elvira, Joaquín Míguez

In this paper, we propose a probabilistic optimization method, the probabilistic incremental proximal gradient (PIPG) method, obtained by developing a probabilistic interpretation of the incremental proximal gradient algorithm.
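As background for the machinery the paper reinterprets, here is a sketch of a plain (non-probabilistic) incremental proximal gradient iteration on a toy lasso problem: cycle over the smooth components, take a gradient step on one component, then apply the proximal operator of the l1 term. All problem sizes and step sizes are illustrative; this is not the PIPG method itself.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy lasso: min_x (1/2n) ||A x - b||^2 + lam * ||x||_1,
# with per-row smooth components f_i(x) = (1/2)(a_i^T x - b_i)^2.
n, d, lam, gamma = 200, 5, 0.1, 0.01
A = rng.normal(size=(n, d))
x_true = np.array([1.0, -2.0, 0.0, 0.0, 0.5])
b = A @ x_true + 0.1 * rng.normal(size=n)

def soft_threshold(v, t):            # proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(d)
for epoch in range(50):
    for i in rng.permutation(n):     # incremental pass over components
        grad_i = (A[i] @ x - b[i]) * A[i]           # gradient of f_i
        x = soft_threshold(x - gamma * grad_i, gamma * lam)
print(x)  # near x_true, with the usual lasso shrinkage toward zero
```

The paper's contribution is to read this kind of iteration probabilistically rather than to change the update itself.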

On the Relationship between Online Gaussian Process Regression and Kernel Least Mean Squares Algorithms

no code implementations · 11 Sep 2016 · Steven Van Vaerenbergh, Jesus Fernandez-Bes, Víctor Elvira

We study the relationship between online Gaussian process (GP) regression and kernel least mean squares (KLMS) algorithms.
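A bare-bones KLMS sketch shows the kernelized LMS update that the paper relates to online GP regression: each incoming sample becomes a kernel center, with a coefficient proportional to the instantaneous prediction error. The kernel width, learning rate, and target function are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

def rbf(a, b, gamma=1.0):
    return np.exp(-gamma * (a - b) ** 2)

eta = 0.5                     # LMS learning rate (illustrative)
centers, coeffs = [], []

def predict(x):
    # Current estimate: a kernel expansion over the stored centers.
    return sum(a * rbf(c, x) for c, a in zip(centers, coeffs))

# Learn y = sin(x) from a stream of noisy samples.
for _ in range(500):
    x = rng.uniform(-3, 3)
    y = np.sin(x) + 0.05 * rng.normal()
    e = y - predict(x)        # instantaneous error
    centers.append(x)         # each sample becomes a kernel center
    coeffs.append(eta * e)    # LMS-style step on the new coefficient

print(predict(0.0), predict(np.pi / 2))
```

After the stream is processed, `predict` should track sin(x) over the sampled interval; the growing dictionary of centers is the feature that connects KLMS to the nonparametric GP view studied in the paper.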

