Search Results for author: Markus Heinonen

Found 41 papers, 27 papers with code

Understanding deep neural networks through the lens of their non-linearity

no code implementations17 Oct 2023 Quentin Bouniot, Ievgen Redko, Anton Mallasto, Charlotte Laclau, Karol Arndt, Oliver Struckmeier, Markus Heinonen, Ville Kyrki, Samuel Kaski

The remarkable success of deep neural networks (DNNs) is often attributed to their high expressive power and their ability to approximate functions of arbitrary complexity.

Learning Space-Time Continuous Neural PDEs from Partially Observed States

1 code implementation9 Jul 2023 Valerii Iakovlev, Markus Heinonen, Harri Lähdesmäki

We introduce a novel grid-independent model for learning partial differential equations (PDEs) from noisy and partial observations on irregular spatiotemporal grids.

Variational Inference

Input gradient diversity for neural network ensembles

no code implementations5 Jun 2023 Trung Trinh, Markus Heinonen, Luigi Acerbi, Samuel Kaski

Deep Ensembles (DEs) demonstrate improved accuracy, calibration and robustness to perturbations over single neural networks partly due to their functional diversity.

Ensemble Learning Image Classification +1

AbODE: Ab Initio Antibody Design using Conjoined ODEs

no code implementations31 May 2023 Yogesh Verma, Markus Heinonen, Vikas Garg

Antibodies are Y-shaped proteins that neutralize pathogens and constitute the core of our adaptive immune system.

Graph Matching Protein Folding

Uncertainty-Aware Natural Language Inference with Stochastic Weight Averaging

1 code implementation10 Apr 2023 Aarne Talman, Hande Celikkanat, Sami Virpioja, Markus Heinonen, Jörg Tiedemann

This paper introduces Bayesian uncertainty modeling using Stochastic Weight Averaging-Gaussian (SWAG) in Natural Language Understanding (NLU) tasks.

Natural Language Inference Natural Language Understanding
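The SWAG procedure referenced here is simple to sketch: run SGD past convergence, collect weight snapshots along the trajectory, fit a Gaussian to them, and average predictions over sampled weights. A minimal diagonal-SWAG sketch on a toy one-weight linear model (the toy data and all constants are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = 2x + noise, fit with a single-weight linear model.
X = rng.normal(size=100)
y = 2.0 * X + 0.1 * rng.normal(size=100)

w = 0.0
snapshots = []
for step in range(300):
    idx = rng.integers(0, 100, size=10)           # minibatch SGD
    grad = -2.0 * np.mean(X[idx] * (y[idx] - X[idx] * w))
    w -= 0.05 * grad
    if step >= 150 and step % 5 == 0:             # collect in the SGD tail
        snapshots.append(w)

# SWAG (diagonal): fit a Gaussian over weights from the snapshots.
swa_mean = np.mean(snapshots)
swa_std = np.std(snapshots) + 1e-6

# Bayesian prediction at x = 1: average over sampled weights.
w_samples = rng.normal(swa_mean, swa_std, size=100)
pred_mean = np.mean(w_samples * 1.0)
```

The spread of the SGD iterates around their running mean is what supplies the predictive uncertainty, at essentially no cost beyond standard training.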

Modular Flows: Differential Molecular Generation

no code implementations12 Oct 2022 Yogesh Verma, Samuel Kaski, Markus Heinonen, Vikas Garg

Generating new molecules is fundamental to advancing critical applications such as drug discovery and material synthesis.

Density Estimation Drug Discovery

Latent Neural ODEs with Sparse Bayesian Multiple Shooting

1 code implementation7 Oct 2022 Valerii Iakovlev, Cagatay Yildiz, Markus Heinonen, Harri Lähdesmäki

Training dynamic models, such as neural ODEs, on long trajectories is a hard problem that requires using various tricks, such as trajectory splitting, to make model training work in practice.

Variational Inference
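The core multiple-shooting trick (without the sparse Bayesian machinery the paper adds) can be sketched directly: split the long trajectory into short segments, give each segment its own initial state, and penalize the gap between each segment's start and the previous segment's simulated end. A minimal numpy sketch on known linear dynamics, with the shooting states initialized from the data (all details illustrative):

```python
import numpy as np

def euler(f, x0, ts):
    """Integrate dx/dt = f(x) with forward Euler over times ts."""
    xs = [x0]
    for t0, t1 in zip(ts[:-1], ts[1:]):
        xs.append(xs[-1] + (t1 - t0) * f(xs[-1]))
    return np.array(xs)

# Ground-truth linear dynamics and a long observed trajectory.
f = lambda x: -0.5 * x
ts = np.linspace(0.0, 10.0, 101)
traj = 3.0 * np.exp(-0.5 * ts)

# Multiple shooting: short segments, each with its own initial state,
# plus a continuity penalty tying consecutive segments together.
n_seg, seg_len = 10, 10
loss = 0.0
for s in range(n_seg):
    lo = s * seg_len
    hi = lo + seg_len + 1
    x0 = traj[lo]                                  # shooting variable
    pred = euler(f, x0, ts[lo:hi])
    loss += np.mean((pred - traj[lo:hi]) ** 2)     # data fit
    if s > 0:
        loss += (x0 - prev_end) ** 2               # continuity penalty
    prev_end = pred[-1]
```

Because each segment is integrated over a short horizon, gradients never have to flow through the full trajectory, which is what makes training on long sequences tractable.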

Incorporating functional summary information in Bayesian neural networks using a Dirichlet process likelihood approach

1 code implementation4 Jul 2022 Vishnu Raj, Tianyu Cui, Markus Heinonen, Pekka Marttinen

We present a simple approach to incorporate prior knowledge in BNNs based on external summary information about the predicted classification probabilities for a given dataset.

Variational Inference

Generative Modelling With Inverse Heat Dissipation

1 code implementation21 Jun 2022 Severi Rissanen, Markus Heinonen, Arno Solin

While diffusion models have shown great success in image generation, their noise-inverting generative process does not explicitly consider the structure of images, such as their inherent multi-scale nature.

Disentanglement Image Generation +1
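The forward process this paper inverts is the heat equation, i.e. a Gaussian blur whose width grows with time, so fine detail is destroyed before coarse structure. A 1-D spectral sketch of that forward process (periodic grid, unit spacing; illustrative, not the paper's implementation):

```python
import numpy as np

def heat_dissipate(u, t):
    # Spectral solution of u_t = u_xx on a periodic grid:
    # Fourier mode k decays as exp(-k^2 t), a Gaussian blur of growing width.
    k = 2 * np.pi * np.fft.fftfreq(len(u))
    return np.real(np.fft.ifft(np.fft.fft(u) * np.exp(-k**2 * t)))

# A sharp box "image": coarse structure survives, fine detail dies first.
x = np.arange(256)
u = (np.abs(x - 128) < 20).astype(float)
forward = [heat_dissipate(u, t) for t in (0.0, 10.0, 100.0, 1000.0)]
```

The mean (the zero-frequency mode) is preserved exactly while the variance decays monotonically, which is the multi-scale structure the generative model exploits when running the process in reverse.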

Tackling covariate shift with node-based Bayesian neural networks

1 code implementation6 Jun 2022 Trung Trinh, Markus Heinonen, Luigi Acerbi, Samuel Kaski

In this paper, we interpret these latent noise variables as implicit representations of simple and domain-agnostic data perturbations during training, producing BNNs that perform well under covariate shift due to input corruptions.

Image Classification

De-randomizing MCMC dynamics with the diffusion Stein operator

no code implementations NeurIPS 2021 Zheyang Shen, Markus Heinonen, Samuel Kaski

Parallel to LD, Stein variational gradient descent (SVGD) similarly minimizes the KL, albeit endowed with a novel Stein-Wasserstein distance, by deterministically transporting a set of particle samples, thus de-randomizing the stochastic diffusion process.

Bayesian Inference

Enforcing physics-based algebraic constraints for inference of PDE models on unstructured grids

no code implementations29 Sep 2021 Valerii Iakovlev, Markus Heinonen, Harri Lähdesmäki

Data-driven neural network models have recently shown great success in modelling and learning complex PDE systems.

Evolving-Graph Gaussian Processes

1 code implementation29 Jun 2021 David Blanco-Mulero, Markus Heinonen, Ville Kyrki

Graph Gaussian Processes (GGPs) provide a data-efficient solution on graph structured domains.

Gaussian Processes regression +2

Variational multiple shooting for Bayesian ODEs with Gaussian processes

1 code implementation21 Jun 2021 Pashupati Hegde, Çağatay Yıldız, Harri Lähdesmäki, Samuel Kaski, Markus Heinonen

Recent machine learning advances have proposed black-box estimation of unknown continuous-time system dynamics directly from data.

Bayesian Inference Gaussian Processes +1

Affine Transport for Sim-to-Real Domain Adaptation

no code implementations25 May 2021 Anton Mallasto, Karol Arndt, Markus Heinonen, Samuel Kaski, Ville Kyrki

In this paper, we present affine transport -- a variant of optimal transport that models the mapping between the state transition distributions of the source and target domains with an affine transformation.

Domain Adaptation OpenAI Gym
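In the Gaussian case, the optimal affine map between two distributions is determined by matching their first two moments. A 1-D sketch of that idea (the toy domains are illustrative, not the paper's sim-to-real setup):

```python
import numpy as np

rng = np.random.default_rng(1)

# Source- and target-domain samples of a transition quantity (1-D here).
src = rng.normal(loc=0.0, scale=1.0, size=5000)
tgt = rng.normal(loc=2.0, scale=0.5, size=5000)

# Affine map matching first and second moments (the 1-D optimal transport
# map between Gaussians): T(x) = mu_t + (sigma_t / sigma_s) * (x - mu_s).
mu_s, sd_s = src.mean(), src.std()
mu_t, sd_t = tgt.mean(), tgt.std()
T = lambda x: mu_t + (sd_t / sd_s) * (x - mu_s)

mapped = T(src)
```

By construction, the mapped source samples have exactly the target's empirical mean and standard deviation.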

Continuous-Time Model-Based Reinforcement Learning

1 code implementation9 Feb 2021 Çağatay Yıldız, Markus Heinonen, Harri Lähdesmäki

Model-based reinforcement learning (MBRL) approaches rely on discrete-time state transition models whereas physical systems and the vast majority of control tasks operate in continuous-time.

Model-based Reinforcement Learning reinforcement-learning +1

Scalable Bayesian neural networks by layer-wise input augmentation

1 code implementation26 Oct 2020 Trung Trinh, Samuel Kaski, Markus Heinonen

We introduce implicit Bayesian neural networks, a simple and scalable approach for uncertainty representation in deep learning.

Image Classification

Bayesian Inference for Optimal Transport with Stochastic Cost

no code implementations19 Oct 2020 Anton Mallasto, Markus Heinonen, Samuel Kaski

In machine learning and computer vision, optimal transport has had significant success in learning generative models and defining metric distances between structured and stochastic data objects that can be cast as probability measures.

Bayesian Inference

Likelihood-Free Inference with Deep Gaussian Processes

1 code implementation18 Jun 2020 Alexander Aushev, Henri Pesonen, Markus Heinonen, Jukka Corander, Samuel Kaski

In recent years, surrogate models have been successfully used in likelihood-free inference to decrease the number of simulator evaluations.

Bayesian Optimization Gaussian Processes

Learning continuous-time PDEs from sparse data with graph neural networks

1 code implementation ICLR 2021 Valerii Iakovlev, Markus Heinonen, Harri Lähdesmäki

We demonstrate the model's ability to work with unstructured grids, arbitrary time steps, and noisy observations.

Sparse Gaussian Processes Revisited: Bayesian Approaches to Inducing-Variable Approximations

no code implementations6 Mar 2020 Simone Rossi, Markus Heinonen, Edwin V. Bonilla, Zheyang Shen, Maurizio Filippone

Variational inference techniques based on inducing variables provide an elegant framework for scalable posterior estimation in Gaussian process (GP) models.

Gaussian Processes Variational Inference

ODE2VAE: Deep generative second order ODEs with Bayesian neural networks

1 code implementation NeurIPS 2019 Cagatay Yildiz, Markus Heinonen, Harri Lahdesmaki

We present Ordinary Differential Equation Variational Auto-Encoder (ODE2VAE), a latent second order ODE model for high-dimensional sequential data.

Imputation motion prediction +2

Learning spectrograms with convolutional spectral kernels

no code implementations23 May 2019 Zheyang Shen, Markus Heinonen, Samuel Kaski

We introduce the convolutional spectral kernel (CSK), a novel family of non-stationary, nonparametric covariance kernels for Gaussian process (GP) models, derived from the convolution between two imaginary radial basis functions.

Gaussian Processes

Neural Non-Stationary Spectral Kernel

1 code implementation27 Nov 2018 Sami Remes, Markus Heinonen, Samuel Kaski

Spectral mixture kernels have been proposed as general-purpose, flexible kernels for learning and discovering more complicated patterns in the data.

Gaussian Processes

Harmonizable mixture kernels with variational Fourier features

no code implementations10 Oct 2018 Zheyang Shen, Markus Heinonen, Samuel Kaski

The expressive power of Gaussian processes depends heavily on the choice of kernel.

Gaussian Processes

Deep learning with differential Gaussian process flows

1 code implementation9 Oct 2018 Pashupati Hegde, Markus Heinonen, Harri Lähdesmäki, Samuel Kaski

We propose a novel deep learning paradigm of differential flows that learn stochastic differential equation transformations of inputs prior to a standard classification or regression function.

Gaussian Processes General Classification +1

Deep convolutional Gaussian processes

1 code implementation6 Oct 2018 Kenneth Blomqvist, Samuel Kaski, Markus Heinonen

We propose deep convolutional Gaussian processes, a deep Gaussian process architecture with convolutional structure.

Classification Gaussian Processes +2

Learning Stochastic Differential Equations With Gaussian Processes Without Gradient Matching

1 code implementation16 Jul 2018 Cagatay Yildiz, Markus Heinonen, Jukka Intosalmi, Henrik Mannerström, Harri Lähdesmäki

We introduce a novel paradigm for learning non-parametric drift and diffusion functions for stochastic differential equations (SDEs).

Gaussian Processes

Bayesian Metabolic Flux Analysis reveals intracellular flux couplings

1 code implementation18 Apr 2018 Markus Heinonen, Maria Osmala, Henrik Mannerström, Janne Wallenius, Samuel Kaski, Juho Rousu, Harri Lähdesmäki

Flux analysis methods commonly place unrealistic assumptions on fluxes due to the convenience of formulating the problem as a linear programming model, and most methods ignore the notable uncertainty in flux estimates.

Variational zero-inflated Gaussian processes with sparse kernels

1 code implementation13 Mar 2018 Pashupati Hegde, Markus Heinonen, Samuel Kaski

We propose a novel model family of zero-inflated Gaussian processes (ZiGP) for such zero-inflated datasets, produced by sparse kernels through learning a latent probit Gaussian process that can zero out kernel rows and columns whenever the signal is absent.

Gaussian Processes Variational Inference

mGPfusion: Predicting protein stability changes with Gaussian process kernel learning and data fusion

1 code implementation8 Feb 2018 Emmi Jokinen, Markus Heinonen, Harri Lähdesmäki

We introduce a Bayesian data fusion model that re-calibrates the experimental and in silico data sources and then learns a predictive GP model from the combined data.

Protein Design

Random Fourier Features for Operator-Valued Kernels

no code implementations9 May 2016 Romain Brault, Florence d'Alché-Buc, Markus Heinonen

Devoted to multi-task learning and structured output learning, operator-valued kernels provide a flexible tool to build vector-valued functions in the context of Reproducing Kernel Hilbert Spaces.

Multi-Task Learning Translation
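This paper generalizes random Fourier features to operator-valued kernels; the scalar-kernel special case, which the construction builds on, is easy to sketch. For an RBF kernel, sampling frequencies from its spectral density gives a finite feature map whose inner products approximate the kernel (all constants illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(X, n_features, lengthscale=1.0):
    """Random Fourier features approximating the RBF kernel
    k(x, y) = exp(-||x - y||^2 / (2 * lengthscale^2))."""
    d = X.shape[1]
    W = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))
    b = rng.uniform(0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = rng.normal(size=(20, 3))
Phi = rff_features(X, n_features=10000)
K_approx = Phi @ Phi.T
K_exact = np.exp(-((X[:, None] - X[None, :]) ** 2).sum(-1) / 2.0)
```

The approximation error shrinks as O(1/sqrt(n_features)), turning an O(n^2) kernel method into linear-in-n feature-space computation; the operator-valued case replaces the scalar cosine features with matrix-valued ones.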

Non-Stationary Gaussian Process Regression with Hamiltonian Monte Carlo

1 code implementation18 Aug 2015 Markus Heinonen, Henrik Mannerström, Juho Rousu, Samuel Kaski, Harri Lähdesmäki

We present a novel approach for fully non-stationary Gaussian process regression (GPR), where all three key parameters -- noise variance, signal variance and lengthscale -- can be simultaneously input-dependent.

GPR regression
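A standard way to make an RBF kernel's lengthscale and signal variance input-dependent is the Gibbs (Paciorek-Schervish) construction; the paper additionally places priors on these functions and infers them with HMC. A minimal sketch of the kernel itself, with illustrative choices for the input-dependent functions:

```python
import numpy as np

def gibbs_kernel(x1, x2, ell, sigma):
    """Non-stationary RBF (Gibbs) kernel with input-dependent
    lengthscale ell(x) and signal deviation sigma(x), 1-D inputs."""
    l1, l2 = ell(x1)[:, None], ell(x2)[None, :]
    s1, s2 = sigma(x1)[:, None], sigma(x2)[None, :]
    sq = l1**2 + l2**2
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return s1 * s2 * np.sqrt(2 * l1 * l2 / sq) * np.exp(-d2 / sq)

# Illustrative example: lengthscale grows with |x|, unit signal deviation.
ell = lambda x: 0.5 + 0.5 * np.abs(x)
sigma = lambda x: np.ones_like(x)

x = np.linspace(-2, 2, 50)
K = gibbs_kernel(x, x, ell, sigma)
```

The prefactor involving the two lengthscales is what keeps the kernel positive semi-definite even though the lengthscale varies across the input space.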

Learning nonparametric differential equations with operator-valued kernels and gradient matching

no code implementations19 Nov 2014 Markus Heinonen, Florence d'Alché-Buc

Modeling dynamical systems with ordinary differential equations implies a mechanistic view of the process underlying the dynamics.

