Search Results for author: Fredrik Lindsten

Found 44 papers, 17 papers with code

On the connection between Noise-Contrastive Estimation and Contrastive Divergence

no code implementations • 26 Feb 2024 • Amanda Olmin, Jakob Lindqvist, Lennart Svensson, Fredrik Lindsten

Noise-contrastive estimation (NCE) is a popular method for estimating unnormalised probabilistic models, such as energy-based models, which are effective for modelling complex data distributions.
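The idea is concise enough to sketch: NCE turns estimation of an unnormalised model into logistic regression between data and samples from a known noise distribution. Below is a minimal toy setup of my own (not code from the paper): the model is an unnormalised standard Gaussian whose only free parameter is the claimed log-normaliser `c`, the noise is N(0, 4), and a grid search over `c` stands in for gradient-based optimisation.

```python
import math
import random

random.seed(0)

LOG_SQRT_2PI = 0.5 * math.log(2 * math.pi)

def log_model(x, c):
    # Unnormalised Gaussian model; c is the learnable log-normaliser.
    return -0.5 * x * x - c

def log_noise(x):
    # Noise distribution: N(0, 2^2), known in closed form and easy to sample.
    return -x * x / 8 - math.log(2) - LOG_SQRT_2PI

def nce_loss(c, data, noise):
    """Logistic loss classifying data (label 1) against noise (label 0),
    with logit log f(x; c) - log p_n(x)."""
    loss = 0.0
    for x in data:
        a = log_model(x, c) - log_noise(x)
        loss += math.log1p(math.exp(-a))   # -log sigmoid(a)
    for y in noise:
        a = log_model(y, c) - log_noise(y)
        loss += math.log1p(math.exp(a))    # -log (1 - sigmoid(a))
    return loss / (len(data) + len(noise))

data = [random.gauss(0, 1) for _ in range(4000)]
noise = [random.gauss(0, 2) for _ in range(4000)]

# NCE should recover the true normaliser, log sqrt(2*pi) ~ 0.92.
grid = [i / 100 for i in range(151)]
c_hat = min(grid, key=lambda c: nce_loss(c, data, noise))
```

Because the noise density appears only through the log-ratio, the model never needs to be normalised — which is exactly what makes NCE attractive for energy-based models.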

Discriminator Guidance for Autoregressive Diffusion Models

no code implementations • 24 Oct 2023 • Filip Ekström Kelvinius, Fredrik Lindsten

The use of a discriminator to guide a diffusion process has previously been used for continuous diffusion models, and in this work we derive ways of using a discriminator together with a pretrained generative model in the discrete case.

Graph-based Neural Weather Prediction for Limited Area Modeling

1 code implementation • 29 Sep 2023 • Joel Oskarsson, Tomas Landelius, Fredrik Lindsten

The rise of accurate machine learning methods for weather forecasting is creating radical new possibilities for modeling the atmosphere.

Weather Forecasting

Temporal Graph Neural Networks for Irregular Data

1 code implementation • 16 Feb 2023 • Joel Oskarsson, Per Sidén, Fredrik Lindsten

Our TGNN4I model is designed to handle both irregular time steps and partial observations of the graph.

Time Series • Time Series Analysis

Calibration tests beyond classification

1 code implementation • ICLR 2021 • David Widmann, Fredrik Lindsten, Dave Zachariah

In the machine learning literature, different measures and statistical tests have been proposed and studied for evaluating the calibration of classification models.

Classification • Decision Making • +3

A Variational Perspective on Generative Flow Networks

no code implementations • 14 Oct 2022 • Heiko Zimmermann, Fredrik Lindsten, Jan-Willem van de Meent, Christian A. Naesseth

Generative flow networks (GFNs) are a class of models for sequential sampling of composite objects, which approximate a target distribution that is defined in terms of an energy function or a reward.

Variational Inference
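For intuition about what "approximating a target defined by a reward" means in practice, here is a sketch of trajectory balance, one standard GFN training objective (this line of work relates such objectives to variational divergences; the toy numbers below are illustrative, not taken from the paper):

```python
import math

def trajectory_balance_loss(log_Z, log_pf, log_pb, log_reward):
    """Squared trajectory-balance residual for one complete trajectory:
    (log Z + sum log P_F - log R(x) - sum log P_B)^2.
    Zero exactly when the forward flow through the trajectory matches
    the reward-weighted backward flow."""
    resid = log_Z + sum(log_pf) - log_reward - sum(log_pb)
    return resid * resid

# A toy 2-step trajectory: forward policy picks each step w.p. 0.5,
# backward policy is deterministic (prob 1), terminal reward 0.25, Z = 1.
loss = trajectory_balance_loss(
    log_Z=0.0,
    log_pf=[math.log(0.5), math.log(0.5)],
    log_pb=[0.0, 0.0],
    log_reward=math.log(0.25),
)
```

With Z * 0.5 * 0.5 = R = 0.25 the flows balance and the residual vanishes; training drives this residual to zero over sampled trajectories.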

Marginalized particle Gibbs for multiple state-space models coupled through shared parameters

no code implementations • 13 Oct 2022 • Anna Wigren, Fredrik Lindsten

We provide insights on when each sampler should be used and show that they can be combined to form an efficient PG sampler for a model with strong dependencies between states and parameters.

Bayesian Inference • Time Series • +1

Scalable Deep Gaussian Markov Random Fields for General Graphs

1 code implementation • 10 Jun 2022 • Joel Oskarsson, Per Sidén, Fredrik Lindsten

We propose a flexible GMRF model for general graphs built on the multi-layer structure of Deep GMRFs, originally proposed for lattice graphs only.

Bayesian Inference • Variational Inference

Robustness and Reliability When Training With Noisy Labels

no code implementations • 7 Oct 2021 • Amanda Olmin, Fredrik Lindsten

We find that strictly proper and robust loss functions both offer asymptotic robustness in accuracy, but neither guarantee that the final model is calibrated.

Uncertainty Quantification
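The robustness-versus-calibration distinction can be illustrated with two binary losses: cross-entropy, which is strictly proper but unbounded, and MAE over the probability vector, which is symmetric and bounded, hence robust to label noise. This is a generic toy sketch, not the paper's analysis:

```python
import math

def cross_entropy(p, y):
    """Strictly proper but unbounded: a single confidently mislabelled
    example can dominate the empirical risk."""
    return -math.log(p if y == 1 else 1.0 - p)

def mae(p, y):
    """MAE on the probability vector (p, 1-p): symmetric and bounded by 2,
    so a flipped label contributes at most a fixed amount of loss."""
    return 2.0 * (1.0 - p) if y == 1 else 2.0 * p

# A confident, correct prediction whose label gets flipped by noise:
p = 0.999
ce_noisy = cross_entropy(p, 0)   # blows up as p -> 1
mae_noisy = mae(p, 0)            # stays bounded near 2
```

Boundedness buys robustness in accuracy, but nothing here forces the learned probabilities themselves to be calibrated — which is the gap the paper highlights.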

Markovian Score Climbing: Variational Inference with KL(p||q)

no code implementations • NeurIPS 2020 • Christian A. Naesseth, Fredrik Lindsten, David Blei

Modern variational inference (VI) uses stochastic gradients to avoid intractable expectations, enabling large-scale probabilistic inference in complex models.

Variational Inference
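Minimising the inclusive KL(p||q) requires expectations under the intractable target p. A standard building block for estimating such expectations (shown here with plain self-normalised importance sampling; the paper's Markovian scheme is more sophisticated, and the distributions below are illustrative) is:

```python
import math
import random

random.seed(0)

def log_pdf(x, mu, sigma):
    # Log density of N(mu, sigma^2).
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma) - 0.5 * math.log(2.0 * math.pi)

# Target p = N(1, 1) (playing the role of an intractable posterior),
# proposal q = N(0, 4). Estimate E_p[x] = 1 using only q-samples.
n = 20000
xs = [random.gauss(0.0, 2.0) for _ in range(n)]
logw = [log_pdf(x, 1.0, 1.0) - log_pdf(x, 0.0, 2.0) for x in xs]
m = max(logw)
w = [math.exp(lw - m) for lw in logw]
s = sum(w)
w = [wi / s for wi in w]                      # self-normalised weights
est = sum(wi * xi for wi, xi in zip(w, xs))   # ~ E_p[x] = 1
```

Self-normalisation introduces a bias that vanishes with the sample size; Markovian score climbing replaces this static scheme with a Markov kernel so that the stochastic gradients target KL(p||q) without systematic bias.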

A general framework for ensemble distribution distillation

1 code implementation • 26 Feb 2020 • Jakob Lindqvist, Amanda Olmin, Fredrik Lindsten, Lennart Svensson

Ensembles of neural networks have been shown to give better performance than single networks, both in terms of predictions and uncertainty estimation.

regression
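For regression, a common way to combine an ensemble (a generic deep-ensemble-style recipe, not necessarily the paper's distillation framework) is to treat the members' Gaussian predictions as a uniform mixture; the predictive variance then splits into an average member variance plus member disagreement:

```python
def ensemble_predict(member_means, member_vars):
    """Combine M Gaussian member predictions as a uniform mixture:
    ensemble mean = average of member means; predictive variance =
    average member variance (aleatoric) + variance of the member
    means around the ensemble mean (disagreement / epistemic)."""
    M = len(member_means)
    mean = sum(member_means) / M
    aleatoric = sum(member_vars) / M
    epistemic = sum((m - mean) ** 2 for m in member_means) / M
    return mean, aleatoric + epistemic

# Three members that agree on average but disagree individually:
mean, var = ensemble_predict([1.0, 1.2, 0.8], [0.1, 0.1, 0.1])
```

The disagreement term is what a single network lacks; distillation aims to compress this whole predictive distribution into one model.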

Deep Gaussian Markov Random Fields

1 code implementation • ICML 2020 • Per Sidén, Fredrik Lindsten

Gaussian Markov random fields (GMRFs) are probabilistic graphical models widely used in spatial statistics and related fields to model dependencies over spatial structures.

Variational Inference

Calibration tests in multi-class classification: A unifying framework

1 code implementation • NeurIPS 2019 • David Widmann, Fredrik Lindsten, Dave Zachariah

In safety-critical applications a probabilistic model is usually required to be calibrated, i.e., to capture the uncertainty of its predictions accurately.

Classification General Classification +1

Particle filter with rejection control and unbiased estimator of the marginal likelihood

no code implementations • 21 Oct 2019 • Jan Kudlicka, Lawrence M. Murray, Thomas B. Schön, Fredrik Lindsten

While the variance reducing properties of rejection control are known, there has not been (to the best of our knowledge) any work on unbiased estimation of the marginal likelihood (also known as the model evidence or the normalizing constant) in this type of particle filter.

Matrix Multilayer Perceptron

no code implementations • 25 Sep 2019 • Jalil Taghia, Maria Bånkestad, Fredrik Lindsten, Thomas Schön

Models that output a vector of responses given some inputs, in the form of a conditional mean vector, are at the core of machine learning.

Elements of Sequential Monte Carlo

no code implementations • 12 Mar 2019 • Christian A. Naesseth, Fredrik Lindsten, Thomas B. Schön

A core problem in statistics and probabilistic machine learning is to compute probability distributions and expectations.

Bayesian Inference • BIG-bench Machine Learning • +1
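The workhorse SMC example is the bootstrap particle filter. Here is a self-contained sketch on a toy linear-Gaussian state-space model (all parameter values are illustrative), including the standard SMC estimate of the log marginal likelihood:

```python
import math
import random

random.seed(1)

def bootstrap_pf(ys, n=500, phi=0.9, sx=1.0, sy=0.5):
    """Bootstrap particle filter for x_t = phi*x_{t-1} + N(0, sx^2),
    y_t = x_t + N(0, sy^2). Returns filtered means and the SMC
    estimate of the log marginal likelihood log p(y_{1:T})."""
    xs = [random.gauss(0.0, sx) for _ in range(n)]
    loglik, means = 0.0, []
    const = -math.log(sy) - 0.5 * math.log(2.0 * math.pi)
    for y in ys:
        # Propagate every particle through the transition (bootstrap proposal).
        xs = [phi * x + random.gauss(0.0, sx) for x in xs]
        # Weight by the observation likelihood, in log-space for stability.
        logw = [const - 0.5 * ((y - x) / sy) ** 2 for x in xs]
        m = max(logw)
        w = [math.exp(lw - m) for lw in logw]
        s = sum(w)
        loglik += m + math.log(s / n)  # log of the average unnormalised weight
        wn = [wi / s for wi in w]
        means.append(sum(wi * xi for wi, xi in zip(wn, xs)))
        # Multinomial resampling.
        xs = random.choices(xs, weights=wn, k=n)
    return means, loglik

# Simulate T = 50 observations from the same model, then filter them.
x, ys = 0.0, []
for _ in range(50):
    x = 0.9 * x + random.gauss(0.0, 1.0)
    ys.append(x + random.gauss(0.0, 0.5))
means, loglik = bootstrap_pf(ys)
```

The per-step average of the unnormalised weights yields an unbiased estimate of the marginal likelihood, which is exactly the quantity pseudo-marginal MCMC methods build on.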

Evaluating model calibration in classification

1 code implementation • 19 Feb 2019 • Juozas Vaicenavicius, David Widmann, Carl Andersson, Fredrik Lindsten, Jacob Roll, Thomas B. Schön

Probabilistic classifiers output a probability distribution on target classes rather than just a class prediction.

Classification • Decision Making • +1
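A common (if coarse) summary statistic in this area is the binned expected calibration error; the paper develops more principled evaluation tools, but a minimal ECE sketch conveys the basic idea:

```python
def expected_calibration_error(confidences, correct, n_bins=10):
    """Binned ECE: partition predictions by confidence, then average the
    per-bin |accuracy - mean confidence| gap, weighted by bin size."""
    bins = [[] for _ in range(n_bins)]
    for conf, ok in zip(confidences, correct):
        bins[min(int(conf * n_bins), n_bins - 1)].append((conf, ok))
    total = len(confidences)
    ece = 0.0
    for b in bins:
        if b:
            avg_conf = sum(c for c, _ in b) / len(b)
            acc = sum(o for _, o in b) / len(b)
            ece += (len(b) / total) * abs(acc - avg_conf)
    return ece

# Five predictions at confidence 0.8 with 4/5 correct: perfectly calibrated.
ece = expected_calibration_error([0.8] * 5, [1, 1, 1, 1, 0])
```

Binned ECE depends heavily on the binning scheme and only looks at the top-class confidence, which motivates the statistical tests studied in this line of work.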

Constructing the Matrix Multilayer Perceptron and its Application to the VAE

no code implementations • 4 Feb 2019 • Jalil Taghia, Maria Bånkestad, Fredrik Lindsten, Thomas B. Schön

However, in certain scenarios we are interested in learning structured parameters (predictions) in the form of symmetric positive definite matrices.

Graphical model inference: Sequential Monte Carlo meets deterministic approximations

2 code implementations • NeurIPS 2018 • Fredrik Lindsten, Jouni Helske, Matti Vihola

Approximate inference in probabilistic graphical models (PGMs) can be grouped into deterministic methods and Monte-Carlo-based methods.

Learning dynamical systems with particle stochastic approximation EM

no code implementations • 25 Jun 2018 • Andreas Lindholm, Fredrik Lindsten

By combining stochastic approximation EM and particle Gibbs with ancestor sampling (PGAS), PSAEM obtains superior computational performance and convergence properties compared to plain particle-smoothing-based approximations of the EM algorithm.

Pseudo-extended Markov chain Monte Carlo

1 code implementation • NeurIPS 2019 • Christopher Nemeth, Fredrik Lindsten, Maurizio Filippone, James Hensman

In this paper, we introduce the pseudo-extended MCMC method as a simple approach for improving the mixing of the MCMC sampler for multi-modal posterior distributions.

Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo

no code implementations • 7 Mar 2017 • Thomas B. Schön, Andreas Svensson, Lawrence Murray, Fredrik Lindsten

We are concerned with the problem of learning probabilistic models of dynamical systems from measured data.

Learning of state-space models with highly informative observations: a tempered Sequential Monte Carlo solution

no code implementations • 6 Feb 2017 • Andreas Svensson, Thomas B. Schön, Fredrik Lindsten

In particular, for learning of unknown parameters in nonlinear state-space models, methods based on the particle filter (a Monte Carlo method) have proven very useful.

Smoothing with Couplings of Conditional Particle Filters

1 code implementation • 8 Jan 2017 • Pierre E. Jacob, Fredrik Lindsten, Thomas B. Schön

The method combines two recent breakthroughs: the first is a generic debiasing technique for Markov chains due to Rhee and Glynn, and the second is the introduction of a uniformly ergodic Markov chain for smoothing, the conditional particle filter of Andrieu, Doucet and Holenstein.

Methodology • Computation

High-dimensional Filtering using Nested Sequential Monte Carlo

no code implementations • 29 Dec 2016 • Christian A. Naesseth, Fredrik Lindsten, Thomas B. Schön

Sequential Monte Carlo (SMC) methods comprise one of the most successful approaches to approximate Bayesian filtering.

Pseudo-Marginal Hamiltonian Monte Carlo

no code implementations • 8 Jul 2016 • Johan Alenlöv, Arnaud Doucet, Fredrik Lindsten

When following a Markov chain Monte Carlo (MCMC) approach to approximate the posterior distribution in this context, one typically either uses MCMC schemes which target the joint posterior of the parameters and some auxiliary latent variables, or pseudo-marginal Metropolis-Hastings (MH) schemes.

Bayesian Inference
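The pseudo-marginal idea itself is simple: run Metropolis-Hastings with an unbiased estimate of the likelihood in place of the exact one, keeping the estimate alongside the chain state. A toy sketch (random-walk MH on a made-up one-parameter latent-variable model, not the paper's Hamiltonian variant):

```python
import math
import random

random.seed(0)

def loglik_hat(theta, ys, n_mc=64):
    """Log of an unbiased Monte Carlo estimate of p(ys | theta) for the
    toy model x ~ N(theta, 1), y ~ N(x, 1): average the conditional
    density over fresh draws of the latent x, per observation."""
    total = 0.0
    for y in ys:
        acc = 0.0
        for _ in range(n_mc):
            x = random.gauss(theta, 1.0)
            acc += math.exp(-0.5 * (y - x) ** 2) / math.sqrt(2.0 * math.pi)
        total += math.log(acc / n_mc)
    return total

ys = [0.8, 1.3, 0.4, 1.1]          # toy data
theta = 0.0
ll = loglik_hat(theta, ys)          # the estimate is carried with the state
chain = []
for _ in range(500):
    prop = theta + random.gauss(0.0, 0.5)        # random-walk proposal
    ll_prop = loglik_hat(prop, ys)               # fresh estimate at proposal
    # Flat prior: accept with the MH ratio on the *estimated* likelihoods.
    if random.random() < math.exp(min(0.0, ll_prop - ll)):
        theta, ll = prop, ll_prop
    chain.append(theta)
```

Crucially, the estimate for the current state is reused, never refreshed; this is what makes the chain target the exact posterior despite the noisy likelihood.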

Interacting Particle Markov Chain Monte Carlo

1 code implementation • 16 Feb 2016 • Tom Rainforth, Christian A. Naesseth, Fredrik Lindsten, Brooks Paige, Jan-Willem van de Meent, Arnaud Doucet, Frank Wood

We introduce interacting particle Markov chain Monte Carlo (iPMCMC), a PMCMC method based on an interacting pool of standard and conditional sequential Monte Carlo samplers.

Accelerating pseudo-marginal Metropolis-Hastings by correlating auxiliary variables

no code implementations • 17 Nov 2015 • Johan Dahlin, Fredrik Lindsten, Joel Kronander, Thomas B. Schön

Pseudo-marginal Metropolis-Hastings (pmMH) is a powerful method for Bayesian inference in models where the posterior distribution is analytically intractable or computationally costly to evaluate directly.

Bayesian Inference

Sequential Monte Carlo Methods for System Identification

no code implementations • 20 Mar 2015 • Thomas B. Schön, Fredrik Lindsten, Johan Dahlin, Johan Wågberg, Christian A. Naesseth, Andreas Svensson, Liang Dai

One of the key challenges in identifying nonlinear and possibly non-Gaussian state space models (SSMs) is the intractability of estimating the system state.

Quasi-Newton particle Metropolis-Hastings

1 code implementation • 12 Feb 2015 • Johan Dahlin, Fredrik Lindsten, Thomas B. Schön

A possible application is parameter inference in the challenging class of SSMs with intractable likelihoods.

Nested Sequential Monte Carlo Methods

1 code implementation • 9 Feb 2015 • Christian A. Naesseth, Fredrik Lindsten, Thomas B. Schön

NSMC generalises the SMC framework by requiring only approximate, properly weighted, samples from the SMC proposal distribution, while still resulting in a correct SMC algorithm.

Sequential Kernel Herding: Frank-Wolfe Optimization for Particle Filtering

no code implementations • 9 Jan 2015 • Simon Lacoste-Julien, Fredrik Lindsten, Francis Bach

Recently, the Frank-Wolfe optimization algorithm was suggested as a procedure to obtain adaptive quadrature rules for integrals of functions in a reproducing kernel Hilbert space (RKHS) with a potentially faster rate of convergence than Monte Carlo integration (and "kernel herding" was shown to be a special case of this procedure).

Identification of jump Markov linear models using particle filters

no code implementations • 25 Sep 2014 • Andreas Svensson, Thomas B. Schön, Fredrik Lindsten

Jump Markov linear models consist of a finite number of linear state space models and a discrete variable encoding the jumps (or switches) between the different linear models.
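Such a model is easy to simulate; the sketch below uses two illustrative scalar modes and a two-state Markov chain for the switches (the numbers are made up, not from the paper):

```python
import random

random.seed(0)

# x_t = a[s_t] * x_{t-1} + noise, with s_t a discrete Markov jump variable.
A = [0.95, -0.5]                  # per-mode dynamics (both stable)
P = [[0.9, 0.1], [0.2, 0.8]]      # mode transition probabilities

def simulate(T):
    s, x = 0, 0.0
    modes, traj = [], []
    for _ in range(T):
        s = random.choices([0, 1], weights=P[s])[0]   # sample the jump
        x = A[s] * x + random.gauss(0.0, 0.1)          # linear dynamics of mode s
        modes.append(s)
        traj.append(x)
    return modes, traj

modes, traj = simulate(200)
```

Identification then means recovering both the per-mode dynamics and the switching sequence from observed data, which is where the particle filter comes in.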

Divide-and-Conquer with Sequential Monte Carlo

3 code implementations • 19 Jun 2014 • Fredrik Lindsten, Adam M. Johansen, Christian A. Naesseth, Bonnie Kirkpatrick, Thomas B. Schön, John Aston, Alexandre Bouchard-Côté

We propose a novel class of Sequential Monte Carlo (SMC) algorithms, appropriate for inference in probabilistic graphical models.

Sequential Monte Carlo for Graphical Models

no code implementations • NeurIPS 2014 • Christian A. Naesseth, Fredrik Lindsten, Thomas B. Schön

We propose a new framework for how to use sequential Monte Carlo (SMC) algorithms for inference in probabilistic graphical models (PGM).

Particle Gibbs with Ancestor Sampling

no code implementations • 3 Jan 2014 • Fredrik Lindsten, Michael I. Jordan, Thomas B. Schön

Particle Markov chain Monte Carlo (PMCMC) is a systematic way of combining the two main tools used for Monte Carlo statistical inference: sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC).

Identification of Gaussian Process State-Space Models with Particle Stochastic Approximation EM

no code implementations • 17 Dec 2013 • Roger Frigola, Fredrik Lindsten, Thomas B. Schön, Carl E. Rasmussen

Gaussian process state-space models (GP-SSMs) are a very flexible family of models of nonlinear dynamical systems.

Particle Metropolis-Hastings using gradient and Hessian information

no code implementations • 4 Nov 2013 • Johan Dahlin, Fredrik Lindsten, Thomas B. Schön

Particle Metropolis-Hastings (PMH) allows for Bayesian parameter inference in nonlinear state space models by combining Markov chain Monte Carlo (MCMC) and particle filtering.

Particle filter-based Gaussian process optimisation for parameter inference

no code implementations • 4 Nov 2013 • Johan Dahlin, Fredrik Lindsten

Finally, we use a heuristic procedure to obtain a revised parameter iterate, providing an automatic trade-off between exploration and exploitation of the surrogate model.
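A standard way to trade off exploration and exploitation with a Gaussian-process surrogate is the expected improvement acquisition. The paper uses its own heuristic procedure; the sketch below is the generic closed-form EI for a single Gaussian surrogate prediction:

```python
import math

def expected_improvement(mu, sigma, best, xi=0.01):
    """EI for maximisation under a surrogate prediction N(mu, sigma^2):
    rewards both high predicted value (exploitation) and high surrogate
    uncertainty (exploration). xi is a small exploration margin."""
    if sigma <= 0.0:
        return max(mu - best - xi, 0.0)
    z = (mu - best - xi) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (mu - best - xi) * cdf + sigma * pdf

# Same predicted mean, different uncertainty: the uncertain point wins.
low = expected_improvement(1.0, 0.1, best=1.0)
high = expected_improvement(1.0, 1.0, best=1.0)
```

Maximising such an acquisition over the surrogate gives the next parameter iterate automatically, which is the trade-off the abstract alludes to.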

Ancestor Sampling for Particle Gibbs

no code implementations • NeurIPS 2012 • Fredrik Lindsten, Thomas Schön, Michael I. Jordan

We present a novel method in the family of particle MCMC methods that we refer to as particle Gibbs with ancestor sampling (PG-AS).
