Search Results for author: Pasi Jylänki

Found 6 papers, 2 papers with code

Regularizing Solutions to the MEG Inverse Problem Using Space-Time Separable Covariance Functions

no code implementations • 17 Apr 2016 • Arno Solin, Pasi Jylänki, Jaakko Kauramäki, Tom Heskes, Marcel A. J. van Gerven, Simo Särkkä

We apply the method to both simulated and empirical data, and demonstrate the efficiency and generality of our Bayesian source reconstruction approach which subsumes various classical approaches in the literature.

Expectation propagation as a way of life: A framework for Bayesian inference on partitioned data

2 code implementations • 16 Dec 2014 • Aki Vehtari, Andrew Gelman, Tuomas Sivula, Pasi Jylänki, Dustin Tran, Swupnil Sahai, Paul Blomstedt, John P. Cunningham, David Schiminovich, Christian Robert

A common divide-and-conquer approach for Bayesian computation with big data is to partition the data, perform local inference for each piece separately, and combine the results to obtain a global posterior approximation.

Bayesian Inference
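The divide-and-conquer scheme described above can be sketched in its simplest form: a scalar parameter whose per-partition posteriors are approximated as Gaussians, combined by multiplying densities in natural-parameter (precision) form while dividing out the surplus copies of the prior. This is a minimal illustration of the general idea, not the paper's EP algorithm; the function name and the conjugate-normal setup are assumptions for the example.

```python
# Hedged sketch: combine per-partition Gaussian posterior approximations
# N(mu_k, sigma2_k) for a scalar parameter into a global approximation.
# Each partition was fit with the full prior N(mu0, sigma02), so the
# combination divides out the K-1 extra prior factors.

def combine_partitions(mus, sigma2s, mu0, sigma02):
    K = len(mus)
    # Sum precisions, subtract the K-1 surplus prior precisions
    prec = sum(1.0 / s2 for s2 in sigma2s) - (K - 1) / sigma02
    # Same combination for the precision-weighted means
    mean = (sum(m / s2 for m, s2 in zip(mus, sigma2s))
            - (K - 1) * mu0 / sigma02) / prec
    return mean, 1.0 / prec
```

For a conjugate normal model this combination is exact: splitting the data into partitions, fitting each with the full prior, and combining as above reproduces the full-data posterior.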

Approximate Inference for Nonstationary Heteroscedastic Gaussian process Regression

no code implementations • 22 Apr 2014 • Ville Tolvanen, Pasi Jylänki, Aki Vehtari

This paper presents a novel approach for approximate integration over the uncertainty of noise and signal variances in Gaussian process (GP) regression.

regression
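The general idea behind integrating over the uncertainty of a noise variance (rather than fixing it at a point estimate) can be sketched with a simple grid approximation; the paper's actual model and inference method are far more elaborate, and the function below is purely illustrative.

```python
import numpy as np

# Hedged sketch: approximate integration over an unknown noise variance
# by weighting a grid of candidate values with their likelihood, here for
# zero-mean Gaussian data (an illustrative toy model, not the paper's).

def grid_marginalize(y, sigma2_grid):
    n = len(y)
    # Log-likelihood of the data under each candidate noise variance
    ll = (-0.5 * n * np.log(2 * np.pi * sigma2_grid)
          - 0.5 * np.sum(y ** 2) / sigma2_grid)
    # Normalize in a numerically stable way
    w = np.exp(ll - ll.max())
    w /= w.sum()
    # Posterior-weighted estimate of the noise variance
    return float(np.sum(w * sigma2_grid))
```

The point of the weighting is that predictions then average over plausible variance values instead of committing to a single maximum-likelihood estimate.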

Expectation Propagation for Neural Networks with Sparsity-promoting Priors

no code implementations • 27 Mar 2013 • Pasi Jylänki, Aapo Nummenmaa, Aki Vehtari

Comparisons are made to two alternative models with ARD priors: a Gaussian process with a NN covariance function and marginal maximum a posteriori estimates of the relevance parameters, and a NN with Markov chain Monte Carlo integration over all the unknown model parameters.

Bayesian Modeling with Gaussian Processes using the GPstuff Toolbox

1 code implementation • 25 Jun 2012 • Jarno Vanhatalo, Jaakko Riihimäki, Jouni Hartikainen, Pasi Jylänki, Ville Tolvanen, Aki Vehtari

The prior over functions is defined implicitly by the mean and covariance function, which determine the smoothness and variability of the function.

Gaussian Processes
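The quoted sentence — that the covariance function determines the smoothness and variability of functions under the prior — can be illustrated by drawing a sample from a zero-mean GP prior. Note this Python sketch is not GPstuff (a MATLAB/Octave toolbox); the squared-exponential covariance and its parameters are just a common illustrative choice.

```python
import numpy as np

# Hedged sketch: a zero-mean GP prior with a squared-exponential
# covariance. The lengthscale controls smoothness; the signal variance
# controls the overall variability of the sampled function.

def se_cov(x1, x2, lengthscale=1.0, signal_var=1.0):
    d = x1[:, None] - x2[None, :]
    return signal_var * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 50)
K = se_cov(x, x) + 1e-9 * np.eye(len(x))  # jitter for numerical stability
sample = rng.multivariate_normal(np.zeros(len(x)), K)
```

Shrinking the lengthscale makes the sampled curves wiggle faster; increasing the signal variance stretches them vertically, which is exactly the sense in which the mean and covariance function define the prior over functions.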

Gaussian process regression with Student-t likelihood

no code implementations • NeurIPS 2009 • Jarno Vanhatalo, Pasi Jylänki, Aki Vehtari

In this work, we discuss the properties of a Gaussian process regression model with the Student-t likelihood and utilize the Laplace approximation for approximate inference.

regression
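A quick way to see why a Student-t likelihood makes GP regression robust: its negative log-likelihood grows only logarithmically in the residual, while the Gaussian's grows quadratically, so a single outlier exerts far less pull on the fit. The comparison below is a generic illustration (the chosen degrees of freedom and the dropped normalizing constant are assumptions), not the paper's Laplace-approximation machinery.

```python
import math

# Hedged sketch: penalty on a residual r under the two likelihoods.

def gauss_nll(r, sigma=1.0):
    # Gaussian negative log-likelihood: quadratic in the residual
    return 0.5 * (r / sigma) ** 2 + 0.5 * math.log(2 * math.pi * sigma ** 2)

def student_t_nll(r, nu=4.0, sigma=1.0):
    # Student-t negative log-likelihood (additive constant dropped):
    # grows only logarithmically, so large residuals are downweighted
    return 0.5 * (nu + 1) * math.log(1.0 + (r / sigma) ** 2 / nu)
```

For a residual of 10 standard deviations, the Gaussian penalty increases by 50 while the Student-t penalty increases by only about 8, which is the heavy-tail robustness the abstract refers to.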
