Search Results for author: Michael Riis Andersen

Found 22 papers, 7 papers with code

Neural machine translation for automated feedback on children's early-stage writing

no code implementations15 Nov 2023 Jonas Vestergaard Jensen, Mikkel Jordahn, Michael Riis Andersen

In this work, we address the problem of assessing and constructing feedback for early-stage writing automatically using machine learning.

Machine Translation Translation

Polygonizer: An auto-regressive building delineator

no code implementations8 Apr 2023 Maxim Khomiakov, Michael Riis Andersen, Jes Frellsen

In geospatial planning, it is often essential to represent objects in a vectorized format, as this format easily translates to downstream tasks such as web development, graphics, or design.

Semantic Segmentation

Learning to Generate 3D Representations of Building Roofs Using Single-View Aerial Imagery

no code implementations20 Mar 2023 Maxim Khomiakov, Alejandro Valverde Mahou, Alba Reinders Sánchez, Jes Frellsen, Michael Riis Andersen

We present a novel pipeline for learning the conditional distribution of a building roof mesh given pixels from an aerial image, under the assumption that roof geometry follows a set of regular patterns.

On the role of Model Uncertainties in Bayesian Optimization

no code implementations14 Jan 2023 Jonathan Foldager, Mikkel Jordahn, Lars Kai Hansen, Michael Riis Andersen

In this work, we provide an extensive study of the relationship between the BO performance (regret) and uncertainty calibration for popular surrogate models and compare them across both synthetic and real-world experiments.

Bayesian Optimization Decision Making +1

Robust, Automated, and Accurate Black-box Variational Inference

1 code implementation29 Mar 2022 Manushi Welandawe, Michael Riis Andersen, Aki Vehtari, Jonathan H. Huggins

RAABBVI adaptively decreases the learning rate by detecting convergence of the fixed-learning-rate iterates, then estimates the symmetrized Kullback-Leibler (KL) divergence between the current variational approximation and the optimal one.
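The symmetrized KL divergence used here has a closed form for univariate Gaussians, which makes the idea easy to sketch. A minimal illustration (not the RAABBVI implementation; function names are ours):

```python
import math

def kl_gaussian(m1, s1, m2, s2):
    # KL(N(m1, s1^2) || N(m2, s2^2)) for univariate Gaussians
    return math.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

def symmetrized_kl(m1, s1, m2, s2):
    # Symmetrized KL: KL(p||q) + KL(q||p); zero iff the two Gaussians match
    return kl_gaussian(m1, s1, m2, s2) + kl_gaussian(m2, s2, m1, s1)

print(symmetrized_kl(0.0, 1.0, 0.0, 1.0))  # identical distributions -> 0.0
print(symmetrized_kl(0.0, 1.0, 1.0, 1.0))  # unit mean shift -> 1.0
```

A small symmetrized KL between successive approximations is one way to signal that further optimization yields diminishing returns.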

Bayesian Inference Stochastic Optimization +1

Challenges for BBVI with Normalizing Flows

no code implementations ICML Workshop INNF 2021 Akash Kumar Dhaka, Alejandro Catalina, Manushi Welandawe, Michael Riis Andersen, Jonathan H. Huggins, Aki Vehtari

Current black-box variational inference (BBVI) methods require the user to make numerous design choices, such as the selection of the variational objective and approximating family, yet there is little principled guidance on how to do so.

Variational Inference

Challenges and Opportunities in High Dimensional Variational Inference

no code implementations NeurIPS 2021 Akash Kumar Dhaka, Alejandro Catalina, Manushi Welandawe, Michael Riis Andersen, Jonathan H. Huggins, Aki Vehtari

Our framework and supporting experiments help to distinguish between the behavior of BBVI methods for approximating low-dimensional versus moderate-to-high-dimensional posteriors.

Variational Inference Vocal Bursts Intensity Prediction

State Space Expectation Propagation: Efficient Inference Schemes for Temporal Gaussian Processes

1 code implementation ICML 2020 William J. Wilkinson, Paul E. Chang, Michael Riis Andersen, Arno Solin

EP provides some benefits over traditional methods via the introduction of the so-called cavity distribution, and we combine these benefits with the computational efficiency of linearisation, providing extensive empirical analysis that demonstrates the efficacy of various algorithms under this unifying framework.
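The cavity distribution mentioned here can be illustrated in the univariate Gaussian case: the approximate posterior is divided by a single site term, which in natural parameters (precision and precision-weighted mean) is a subtraction. A minimal sketch, not the paper's code:

```python
def cavity(post_mean, post_var, site_mean, site_var):
    """EP cavity for univariate Gaussians: divide the approximate posterior
    by one site term, working in natural parameters."""
    prec = 1.0 / post_var - 1.0 / site_var          # cavity precision
    pm = post_mean / post_var - site_mean / site_var  # precision-weighted mean
    return pm / prec, 1.0 / prec                     # cavity mean, variance

# Posterior N(0.5, 0.5) with one site N(1.0, 2.0) removed
m, v = cavity(0.5, 0.5, 1.0, 2.0)
print(m, v)  # -> 0.333..., 0.666...
```

The cavity acts as the "rest of the model" context against which one likelihood term is refined at each EP step.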

Bayesian Inference Computational Efficiency +2

Uncertainty-aware Sensitivity Analysis Using Rényi Divergences

1 code implementation17 Oct 2019 Topi Paananen, Michael Riis Andersen, Aki Vehtari

For nonlinear supervised learning models, assessing the importance of predictor variables or their interactions is not straightforward, because importance can vary across the domain of the variables.

Global Approximate Inference via Local Linearisation for Temporal Gaussian Processes

no code implementations AABI Symposium 2019 William J. Wilkinson, Paul E. Chang, Michael Riis Andersen, Arno Solin

The extended Kalman filter (EKF) is a classical signal processing algorithm which performs efficient approximate Bayesian inference in non-conjugate models by linearising the local measurement function, avoiding the need to compute intractable integrals when calculating the posterior.
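The linearisation described here is easiest to see for a scalar state: the nonlinear measurement function is replaced by its first-order expansion at the prior mean, turning the update into a standard Kalman correction. An illustrative sketch under that scalar assumption, not the paper's implementation:

```python
def ekf_update(m, P, y, h, h_prime, R):
    """One EKF measurement update for a scalar state.
    m, P: prior mean and variance; y: observation; R: observation noise
    variance; h, h_prime: measurement function and its derivative."""
    H = h_prime(m)               # linearisation at the prior mean
    S = H * P * H + R            # innovation variance
    K = P * H / S                # Kalman gain
    m_new = m + K * (y - h(m))   # corrected mean
    P_new = (1 - K * H) * P      # corrected variance
    return m_new, P_new

# Nonlinear measurement y = x^2 + noise; observe y = 4.1 with prior N(1.5, 1.0)
m, P = ekf_update(1.5, 1.0, 4.1, lambda x: x**2, lambda x: 2 * x, 0.1)
```

The closed-form correction is what avoids the intractable integrals a non-conjugate likelihood would otherwise require.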

Bayesian Inference Gaussian Processes +1

Bayesian leave-one-out cross-validation for large data

no code implementations24 Apr 2019 Måns Magnusson, Michael Riis Andersen, Johan Jonasson, Aki Vehtari

Model inference, such as model comparison, model checking, and model selection, is an important part of model development.

Model Selection

End-to-End Probabilistic Inference for Nonstationary Audio Analysis

1 code implementation31 Jan 2019 William J. Wilkinson, Michael Riis Andersen, Joshua D. Reiss, Dan Stowell, Arno Solin

A typical audio signal processing pipeline includes multiple disjoint analysis stages, including calculation of a time-frequency representation followed by spectrogram-based feature analysis.
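The first stage of such a pipeline, computing a time-frequency representation, can be sketched as a framed, windowed FFT. A minimal magnitude-spectrogram example (illustrative only; it shows the standard non-probabilistic stage, not the paper's probabilistic model):

```python
import numpy as np

def spectrogram(signal, frame_len=256, hop=128):
    """Magnitude spectrogram via a framed FFT: the time-frequency stage
    of a conventional audio analysis pipeline."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop : i * hop + frame_len] * window
                       for i in range(n_frames)])
    return np.abs(np.fft.rfft(frames, axis=1))  # (n_frames, frame_len//2 + 1)

# A 440 Hz tone sampled at 8 kHz; energy concentrates near bin 440/(8000/256) ~ 14
t = np.arange(8000) / 8000.0
S = spectrogram(np.sin(2 * np.pi * 440 * t))
peak_bin = int(np.argmax(S.mean(axis=0)))
```

Subsequent feature analysis then operates on `S`, which is the "disjoint stages" structure the paper's end-to-end approach replaces.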

Audio Signal Processing regression

Unifying Probabilistic Models for Time-Frequency Analysis

1 code implementation6 Nov 2018 William J. Wilkinson, Michael Riis Andersen, Joshua D. Reiss, Dan Stowell, Arno Solin

In audio signal processing, probabilistic time-frequency models have many benefits over their non-probabilistic counterparts.

Audio Signal Processing Gaussian Processes +1

Variable selection for Gaussian processes via sensitivity analysis of the posterior predictive distribution

2 code implementations21 Dec 2017 Topi Paananen, Juho Piironen, Michael Riis Andersen, Aki Vehtari

Variable selection for Gaussian process models is often done using automatic relevance determination, which uses the inverse length-scale parameter of each input variable as a proxy for variable relevance.
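The automatic relevance determination heuristic referenced here ranks inputs by the inverse of their learned length-scales: a short length-scale means the function varies quickly along that input. A toy sketch of the ranking step (the length-scale values are hypothetical, not from the paper):

```python
def ard_relevance(length_scales):
    """Rank input variables by ARD relevance, assuming an anisotropic
    kernel has been fit: the inverse length-scale is used as a (rough)
    relevance score, smaller length-scale meaning more relevant."""
    scores = {name: 1.0 / ls for name, ls in length_scales.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical learned length-scales for three inputs
ranking = ard_relevance({"x1": 0.5, "x2": 10.0, "x3": 2.0})
print(ranking)  # -> ['x1', 'x3', 'x2']
```

The paper's point is that this proxy can mislead, motivating sensitivity analysis of the posterior predictive distribution instead.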

Gaussian Processes Variable Selection

EEG source imaging assists decoding in a face recognition task

no code implementations17 Apr 2017 Rasmus S. Andersen, Anders U. Eliasen, Nicolai Pedersen, Michael Riis Andersen, Sofie Therese Hansen, Lars Kai Hansen

In this work, we explore the generality of the hypothesis of Edelman et al. by considering decoding of face recognition.

Neurons and Cognition

Correcting boundary over-exploration deficiencies in Bayesian optimization with virtual derivative sign observations

1 code implementation4 Apr 2017 Eero Siivola, Aki Vehtari, Jarno Vanhatalo, Javier González, Michael Riis Andersen

Bayesian optimization (BO) is a global optimization strategy designed to find the minimum of an expensive black-box function, typically defined on a compact subset of $\mathbb{R}^d$, by using a Gaussian process (GP) as a surrogate model for the objective.
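The GP-surrogate loop described here can be sketched in a few lines for a one-dimensional problem: fit a GP to the evaluations so far, minimise a lower-confidence-bound acquisition over a grid, and evaluate the black box there. A minimal illustrative sketch (kernel, acquisition, and constants are our choices, not the paper's):

```python
import numpy as np

def rbf(a, b, ls=0.3):
    # Squared-exponential kernel between two 1-D input vectors
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ls**2)

def bo_minimize(f, n_iter=10, noise=1e-6, seed=0):
    """Minimal BO sketch on [0, 1]: GP surrogate evaluated on a grid,
    lower-confidence-bound acquisition. Illustrative only."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(0.0, 1.0, 201)
    X = list(rng.uniform(0.0, 1.0, 2))       # two random initial evaluations
    y = [f(x) for x in X]
    for _ in range(n_iter):
        Xa, ya = np.array(X), np.array(y)
        K = rbf(Xa, Xa) + noise * np.eye(len(Xa))
        Ks = rbf(grid, Xa)
        mu = Ks @ np.linalg.solve(K, ya)      # GP posterior mean on the grid
        v = np.linalg.solve(K, Ks.T)
        var = np.clip(1.0 - np.sum(Ks * v.T, axis=1), 0.0, None)
        lcb = mu - 2.0 * np.sqrt(var)         # acquisition: lower confidence bound
        x_next = grid[np.argmin(lcb)]         # most promising point so far
        X.append(x_next)
        y.append(f(x_next))
    return X[int(np.argmin(y))], min(y)

x_best, y_best = bo_minimize(lambda x: (x - 0.7)**2)
```

The boundary over-exploration deficiency the paper addresses arises because the GP's predictive variance, and hence the acquisition value, tends to be inflated at the edges of the search domain.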

Bayesian Optimization

Bayesian inference for spatio-temporal spike-and-slab priors

no code implementations15 Sep 2015 Michael Riis Andersen, Aki Vehtari, Ole Winther, Lars Kai Hansen

In this work, we address the problem of solving a series of underdetermined linear inverse problems subject to a sparsity constraint.

Bayesian Inference

Spatio-temporal Spike and Slab Priors for Multiple Measurement Vector Problems

no code implementations19 Aug 2015 Michael Riis Andersen, Ole Winther, Lars Kai Hansen

We are interested in solving the multiple measurement vector (MMV) problem for instances where the underlying sparsity pattern exhibits spatio-temporal structure, motivated by the electroencephalogram (EEG) source localization problem.

EEG