Search Results for author: Arno Solin

Found 78 papers, 50 papers with code

Streamlining Prediction in Bayesian Deep Learning

no code implementations27 Nov 2024 Rui Li, Marcus Klasson, Arno Solin, Martin Trapp

The rising interest in Bayesian deep learning (BDL) has led to a plethora of methods for estimating the posterior distribution.

Deep Learning

Differentially Private Continual Learning using Pre-Trained Models

no code implementations7 Nov 2024 Marlon Tobaben, Marcus Klasson, Rui Li, Arno Solin, Antti Honkela

This work explores the intersection of continual learning (CL) and differential privacy (DP).

Continual Learning

Plan$\times$RAG: Planning-guided Retrieval Augmented Generation

no code implementations28 Oct 2024 Prakhar Verma, Sukruta Prakash Midigeshi, Gaurav Sinha, Arno Solin, Nagarajan Natarajan, Amit Sharma

We introduce Planning-guided Retrieval Augmented Generation (Plan$\times$RAG), a novel framework that augments the \emph{retrieve-then-reason} paradigm of existing RAG frameworks to \emph{plan-then-retrieve}.

Answer Generation RAG +1

Free Hunch: Denoiser Covariance Estimation for Diffusion Models Without Extra Costs

no code implementations15 Oct 2024 Severi Rissanen, Markus Heinonen, Arno Solin

The covariance for clean data given a noisy observation is an important quantity in many conditional generation methods for diffusion models.

Physics-Informed Variational State-Space Gaussian Processes

no code implementations20 Sep 2024 Oliver Hamelijnck, Arno Solin, Theodoros Damoulas

Differential equations are important mechanistic models that are integral to many scientific and engineering applications.

Gaussian Processes

Sources of Uncertainty in 3D Scene Reconstruction

1 code implementation10 Sep 2024 Marcus Klasson, Riccardo Mereu, Juho Kannala, Arno Solin

The process of 3D scene reconstruction can be affected by numerous uncertainty sources in real-world scenes.

3D Reconstruction 3D Scene Reconstruction

Exploiting Hankel-Toeplitz Structures for Fast Computation of Kernel Precision Matrices

no code implementations5 Aug 2024 Frida Viset, Anton Kullberg, Frederiek Wesel, Arno Solin

In this paper, we lower the dominating computational complexity to $\mathcal{O}(NM)$ with no additional approximations.

Hyperparameter Optimization

Improving Discrete Diffusion Models via Structured Preferential Generation

no code implementations28 May 2024 Severi Rissanen, Markus Heinonen, Arno Solin

In the image and audio domains, diffusion models have shown impressive performance.

Alignment is Key for Applying Diffusion Models to Retrosynthesis

no code implementations27 May 2024 Najwa Laabid, Severi Rissanen, Markus Heinonen, Arno Solin, Vikas Garg

Retrosynthesis, the task of identifying precursors for a given molecule, can be naturally framed as a conditional graph generation task.

Graph Generation Retrosynthesis

Flatness Improves Backbone Generalisation in Few-shot Classification

no code implementations11 Apr 2024 Rui Li, Martin Trapp, Marcus Klasson, Arno Solin

Deployment of deep neural networks in real-world settings typically requires adaptation to new tasks with few examples.

Classification

Gaussian Splatting on the Move: Blur and Rolling Shutter Compensation for Natural Camera Motion

1 code implementation20 Mar 2024 Otto Seiskari, Jerry Ylilammi, Valtteri Kaatrasalo, Pekka Rantalankila, Matias Turkulainen, Juho Kannala, Esa Rahtu, Arno Solin

High-quality scene reconstruction and novel view synthesis based on 3D Gaussian Splatting (3DGS) typically require steady, high-quality photographs, which are often impractical to capture with handheld cameras.

Novel View Synthesis

Function-space Parameterization of Neural Networks for Sequential Learning

1 code implementation16 Mar 2024 Aidan Scannell, Riccardo Mereu, Paul Chang, Ella Tamir, Joni Pajarinen, Arno Solin

Our parameterization offers: (i) a way to scale function-space methods to large data sets via sparsification, (ii) retention of prior knowledge when access to past data is limited, and (iii) a mechanism to incorporate new data without retraining.

Continual Learning Gaussian Processes +1

Subtractive Mixture Models via Squaring: Representation and Learning

2 code implementations1 Oct 2023 Lorenzo Loconte, Aleksanteri M. Sladek, Stefan Mengel, Martin Trapp, Arno Solin, Nicolas Gillis, Antonio Vergari

Mixture models are traditionally represented and learned by adding several distributions as components.
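To make the squaring idea in the title concrete, here is a toy sketch (not the authors' implementation; the components, weights, and grid are made up) of a one-dimensional subtractive mixture of Gaussians: one weight is negative, yet squaring and renormalising yields a valid, non-negative density.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical components: a broad Gaussian minus a narrower one (negative weight).
mu    = np.array([0.0, 0.0])
sigma = np.array([2.0, 0.5])
w     = np.array([1.0, -0.6])   # one weight is negative ("subtractive" component)

x = np.linspace(-8.0, 8.0, 2001)
dx = x[1] - x[0]
f = sum(wi * norm.pdf(x, mi, si) for wi, mi, si in zip(w, mu, sigma))

# Squaring keeps the model non-negative; renormalise numerically on the grid.
p = f**2 / (np.sum(f**2) * dx)

print("min of p:      ", p.min())          # >= 0 by construction
print("integral of p: ", np.sum(p) * dx)   # ~ 1 after renormalisation
```

For Gaussian components the same normaliser is also available in closed form from pairwise products of components; the grid integration above is only to keep the sketch short.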

Sparse Function-space Representation of Neural Networks

2 code implementations5 Sep 2023 Aidan Scannell, Riccardo Mereu, Paul Chang, Ella Tamir, Joni Pajarinen, Arno Solin

Deep neural networks (NNs) are known to lack uncertainty estimates and struggle to incorporate new data.

Improving Hyperparameter Learning under Approximate Inference in Gaussian Process Models

1 code implementation7 Jun 2023 Rui Li, ST John, Arno Solin

Approximate inference in Gaussian process (GP) models with non-conjugate likelihoods gets entangled with the learning of the model hyperparameters.

Hyperparameter Optimization Variational Inference

Memory-Based Dual Gaussian Processes for Sequential Learning

1 code implementation6 Jun 2023 Paul E. Chang, Prakhar Verma, S. T. John, Arno Solin, Mohammad Emtiyaz Khan

Sequential learning with Gaussian processes (GPs) is challenging when access to past data is limited, for example, in continual and active learning.

Active Learning Bayesian Optimization +2

Variational Gaussian Process Diffusion Processes

1 code implementation3 Jun 2023 Prakhar Verma, Vincent Adam, Arno Solin

Diffusion processes are a class of stochastic differential equations (SDEs) providing a rich family of expressive models that arise naturally in dynamic modelling tasks.

Variational Inference

PriorCVAE: scalable MCMC parameter inference with Bayesian deep generative modelling

2 code implementations9 Apr 2023 Elizaveta Semenova, Prakhar Verma, Max Cairney-Leeming, Arno Solin, Samir Bhatt, Seth Flaxman

Recent advances have shown that GP priors, or their finite realisations, can be encoded using deep generative models such as variational autoencoders (VAEs).

Bayesian Inference Gaussian Processes

Fixing Overconfidence in Dynamic Neural Networks

1 code implementation13 Feb 2023 Lassi Meronen, Martin Trapp, Andrea Pilzer, Le Yang, Arno Solin

Dynamic neural networks are a recent technique that promises a remedy for the increasing size of modern deep learning models by dynamically adapting their computational cost to the difficulty of the inputs.

Decision Making Deep Learning +1

Transport with Support: Data-Conditional Diffusion Bridges

1 code implementation31 Jan 2023 Ella Tamir, Martin Trapp, Arno Solin

We integrate Bayesian filtering and optimal control into learning the diffusion process, enabling the generation of constrained stochastic processes governed by sparse observations at intermediate stages and terminal constraints.

Time Series

MixupE: Understanding and Improving Mixup from Directional Derivative Perspective

1 code implementation27 Dec 2022 Yingtian Zou, Vikas Verma, Sarthak Mittal, Wai Hoh Tang, Hieu Pham, Juho Kannala, Yoshua Bengio, Arno Solin, Kenji Kawaguchi

Mixup is a popular data augmentation technique for training deep neural networks where additional samples are generated by linearly interpolating pairs of inputs and their labels.

Data Augmentation
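The interpolation described in the abstract above is simple to write down. Below is a minimal NumPy sketch of vanilla mixup (batch shapes and the Beta parameter are illustrative; MixupE's directional-derivative regularisation is not shown):

```python
import numpy as np

def mixup_batch(x, y, alpha=0.2, rng=np.random.default_rng()):
    """Vanilla mixup: convex combinations of random input pairs and their labels.

    x: (batch, features) inputs, y: (batch, classes) one-hot labels.
    """
    lam = rng.beta(alpha, alpha)       # interpolation coefficient
    perm = rng.permutation(len(x))     # random pairing within the batch
    x_mix = lam * x + (1.0 - lam) * x[perm]
    y_mix = lam * y + (1.0 - lam) * y[perm]
    return x_mix, y_mix

# Toy usage with random data
x = np.random.randn(4, 3)
y = np.eye(3)[[0, 1, 2, 0]]
x_mix, y_mix = mixup_batch(x, y)
```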

Towards Improved Learning in Gaussian Processes: The Best of Two Worlds

no code implementations11 Nov 2022 Rui Li, ST John, Arno Solin

Gaussian process training decomposes into inference of the (approximate) posterior and learning of the hyperparameters.

Binary Classification Gaussian Processes +3

Fantasizing with Dual GPs in Bayesian Optimization and Active Learning

no code implementations2 Nov 2022 Paul E. Chang, Prakhar Verma, ST John, Victor Picheny, Henry Moss, Arno Solin

Gaussian processes (GPs) are the main surrogate functions used for sequential modelling such as Bayesian Optimization and Active Learning.

Active Learning Bayesian Optimization +1

Uncertainty-guided Source-free Domain Adaptation

1 code implementation16 Aug 2022 Subhankar Roy, Martin Trapp, Andrea Pilzer, Juho Kannala, Nicu Sebe, Elisa Ricci, Arno Solin

Source-free domain adaptation (SFDA) aims to adapt a classifier to an unlabelled target data set by only using a pre-trained source model.

Source-Free Domain Adaptation

Generative Modelling With Inverse Heat Dissipation

1 code implementation21 Jun 2022 Severi Rissanen, Markus Heinonen, Arno Solin

While diffusion models have shown great success in image generation, their noise-inverting generative process does not explicitly consider the structure of images, such as their inherent multi-scale nature.

Disentanglement Image Generation +1

Disentangling Model Multiplicity in Deep Learning

no code implementations17 Jun 2022 Ari Heljakka, Martin Trapp, Juho Kannala, Arno Solin

This observed 'predictive' multiplicity (PM) also implies elusive differences in the internals of the models, their 'representational' multiplicity (RM).

Deep Learning

A Look at Improving Robustness in Visual-inertial SLAM by Moment Matching

no code implementations27 May 2022 Arno Solin, Rui Li, Andrea Pilzer

The fusion of camera sensor and inertial data is a leading method for ego-motion tracking in autonomous and smart devices.

Non-separable Spatio-temporal Graph Kernels via SPDEs

no code implementations16 Nov 2021 Alexander Nikitin, ST John, Arno Solin, Samuel Kaski

Gaussian processes (GPs) provide a principled and direct approach for inference and learning on graphs.

Gaussian Processes

Dual Parameterization of Sparse Variational Gaussian Processes

1 code implementation NeurIPS 2021 Vincent Adam, Paul E. Chang, Mohammad Emtiyaz Khan, Arno Solin

Sparse variational Gaussian process (SVGP) methods are a common choice for non-conjugate Gaussian process inference because of their computational benefits.

Computational Efficiency Gaussian Processes

Bayes-Newton Methods for Approximate Bayesian Inference with PSD Guarantees

1 code implementation2 Nov 2021 William J. Wilkinson, Simo Särkkä, Arno Solin

We formulate natural gradient variational inference (VI), expectation propagation (EP), and posterior linearisation (PL) as extensions of Newton's method for optimising the parameters of a Bayesian posterior distribution.

Bayesian Inference Gaussian Processes +3

Spatio-Temporal Variational Gaussian Processes

1 code implementation NeurIPS 2021 Oliver Hamelijnck, William J. Wilkinson, Niki A. Loppi, Arno Solin, Theodoros Damoulas

We introduce a scalable approach to Gaussian process inference that combines spatio-temporal filtering with natural gradient variational inference, resulting in a non-conjugate GP method for multivariate data that scales linearly with respect to time.

Gaussian Processes Variational Inference

Scalable Inference in SDEs by Direct Matching of the Fokker-Planck-Kolmogorov Equation

1 code implementation NeurIPS 2021 Arno Solin, Ella Tamir, Prakhar Verma

Simulation-based techniques such as variants of stochastic Runge-Kutta are the de facto approach for inference with stochastic differential equations (SDEs) in machine learning.

Periodic Activation Functions Induce Stationarity

2 code implementations NeurIPS 2021 Lassi Meronen, Martin Trapp, Arno Solin

Neural network models are known to reinforce hidden data biases, making them unreliable and difficult to interpret.

Translation

Sparse Gaussian Processes for Stochastic Differential Equations

no code implementations NeurIPS Workshop DLDE 2021 Prakhar Verma, Vincent Adam, Arno Solin

We frame the problem of learning stochastic differential equations (SDEs) from noisy observations as an inference problem and aim to maximize the marginal likelihood of the observations in a joint model of the latent paths and the noisy observations.

Gaussian Processes Variational Inference

HybVIO: Pushing the Limits of Real-time Visual-inertial Odometry

1 code implementation22 Jun 2021 Otto Seiskari, Pekka Rantalankila, Juho Kannala, Jerry Ylilammi, Esa Rahtu, Arno Solin

We present HybVIO, a novel hybrid approach for combining filtering-based visual-inertial odometry (VIO) with optimization-based SLAM.

Combining Pseudo-Point and State Space Approximations for Sum-Separable Gaussian Processes

1 code implementation AABI Symposium 2021 Will Tebbutt, Arno Solin, Richard E. Turner

Pseudo-point approximations, one of the gold-standard methods for scaling GPs to large data sets, are well suited for handling off-the-grid spatial data.

Epidemiology Gaussian Processes +2

Sparse Algorithms for Markovian Gaussian Processes

1 code implementation19 Mar 2021 William J. Wilkinson, Arno Solin, Vincent Adam

Approximate Bayesian inference methods that scale to very large datasets are crucial in leveraging probabilistic models for real-world time series.

Bayesian Inference Gaussian Processes +3

Novel View Synthesis via Depth-guided Skip Connections

1 code implementation5 Jan 2021 Yuxin Hou, Arno Solin, Juho Kannala

Flow predictions enable the target view to re-use pixels directly, but can easily lead to distorted results.

Decoder Novel View Synthesis

Stationary Activations for Uncertainty Calibration in Deep Learning

1 code implementation NeurIPS 2020 Lassi Meronen, Christabella Irwanto, Arno Solin

We introduce a new family of non-linear neural network activation functions that mimic the properties induced by the widely-used Matérn family of kernels in Gaussian process (GP) models.

Deep Learning General Classification

Movement-induced Priors for Deep Stereo

1 code implementation18 Oct 2020 Yuxin Hou, Muhammad Kamran Janjua, Juho Kannala, Arno Solin

We propose a method for fusing stereo disparity estimation with movement-induced prior information.

Decoder Disparity Estimation +1

State Space Expectation Propagation: Efficient Inference Schemes for Temporal Gaussian Processes

1 code implementation ICML 2020 William J. Wilkinson, Paul E. Chang, Michael Riis Andersen, Arno Solin

EP provides some benefits over the traditional methods via the introduction of the so-called cavity distribution. We combine these benefits with the computational efficiency of linearisation, and provide extensive empirical analysis demonstrating the efficacy of various algorithms under this unifying framework.

Bayesian Inference Computational Efficiency +2

Fast Variational Learning in State-Space Gaussian Process Models

1 code implementation9 Jul 2020 Paul E. Chang, William J. Wilkinson, Mohammad Emtiyaz Khan, Arno Solin

Gaussian process (GP) regression with 1D inputs can often be performed in linear time via a stochastic differential equation formulation.

Time Series Time Series Analysis +1
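As a toy illustration of the linear-time claim in the first sentence of the abstract (not the paper's variational algorithm): a GP with a Matérn-1/2 kernel is equivalent to an Ornstein-Uhlenbeck SDE, so on sorted 1D inputs inference reduces to Kalman filtering (plus a smoothing pass, omitted here) at O(n) rather than O(n^3) cost. All hyperparameters and data below are made up.

```python
import numpy as np

def matern12_gp_filter(t, y, lengthscale=1.0, variance=1.0, noise_var=0.01):
    """Kalman filtering pass for GP regression with a Matérn-1/2 kernel.

    k(t, t') = variance * exp(-|t - t'| / lengthscale) corresponds to an
    Ornstein-Uhlenbeck SDE, so one forward pass over sorted inputs costs O(n).
    (A backward smoothing pass, omitted here, would give the full GP posterior.)
    """
    m, P = 0.0, variance                   # stationary prior on the latent state
    means, variances = [], []
    t_prev = t[0]
    for tk, yk in zip(t, y):
        dt = tk - t_prev
        A = np.exp(-dt / lengthscale)      # discrete-time transition
        Q = variance * (1.0 - A**2)        # matched process noise
        m, P = A * m, A**2 * P + Q         # predict
        S = P + noise_var                  # innovation variance
        K = P / S                          # Kalman gain
        m, P = m + K * (yk - m), (1.0 - K) * P   # update
        means.append(m)
        variances.append(P)
        t_prev = tk
    return np.array(means), np.array(variances)

# Toy usage on synthetic data
t = np.linspace(0.0, 5.0, 200)
y = np.sin(2.0 * t) + 0.1 * np.random.randn(t.size)
mu, var = matern12_gp_filter(t, y, lengthscale=0.7, variance=1.0, noise_var=0.01)
```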

Movement Tracking by Optical Flow Assisted Inertial Navigation

no code implementations24 Jun 2020 Lassi Meronen, William J. Wilkinson, Arno Solin

We consider a visually dense approach, where the IMU data is fused with the dense optical flow field estimated from the camera data.

Optical Flow Estimation Probabilistic Deep Learning

Deep Residual Mixture Models

1 code implementation22 Jun 2020 Perttu Hämäläinen, Martin Trapp, Tuure Saloheimo, Arno Solin

We propose Deep Residual Mixture Models (DRMMs), a novel deep generative model architecture.

BIG-bench Machine Learning

Deep Automodulators

2 code implementations NeurIPS 2020 Ari Heljakka, Yuxin Hou, Juho Kannala, Arno Solin

These networks can faithfully reproduce individual real-world input images like regular autoencoders, but also generate a fused sample from an arbitrary combination of several such images, allowing instantaneous 'style-mixing' and other new applications.

Decoder Disentanglement

Gaussian Process Priors for View-Aware Inference

1 code implementation6 Dec 2019 Yuxin Hou, Ari Heljakka, Arno Solin

While frame-independent predictions with deep neural networks have become the prominent solutions to many computer vision tasks, the potential benefits of utilizing correlations between frames have received less attention.

Novel View Synthesis Translation

Scalable Exact Inference in Multi-Output Gaussian Processes

3 code implementations ICML 2020 Wessel P. Bruinsma, Eric Perim, Will Tebbutt, J. Scott Hosking, Arno Solin, Richard E. Turner

Multi-output Gaussian processes (MOGPs) leverage the flexibility and interpretability of GPs while capturing structure across outputs, which is desirable, for example, in spatio-temporal modelling.

Gaussian Processes

Global Approximate Inference via Local Linearisation for Temporal Gaussian Processes

no code implementations AABI Symposium 2019 William J. Wilkinson, Paul E. Chang, Michael Riis Andersen, Arno Solin

The extended Kalman filter (EKF) is a classical signal processing algorithm which performs efficient approximate Bayesian inference in non-conjugate models by linearising the local measurement function, avoiding the need to compute intractable integrals when calculating the posterior.

Bayesian Inference Gaussian Processes +1

Iterative Path Reconstruction for Large-Scale Inertial Navigation on Smartphones

no code implementations2 Jun 2019 Santiago Cortés Reina, Yuxin Hou, Juho Kannala, Arno Solin

Modern smartphones have all the sensing capabilities required for accurate and robust navigation and tracking.

Motion Estimation

Towards Photographic Image Manipulation with Balanced Growing of Generative Autoencoders

1 code implementation12 Apr 2019 Ari Heljakka, Arno Solin, Juho Kannala

The aim is to combine faithful reconstruction of inputs (e.g., retaining the identity of a face), sharp generated/reconstructed samples in high resolutions, and a well-structured latent space that supports semantic manipulation of the inputs.

Attribute Disentanglement +1

Multi-View Stereo by Temporal Nonparametric Fusion

1 code implementation ICCV 2019 Yuxin Hou, Juho Kannala, Arno Solin

The flexibility of the Gaussian process (GP) prior provides adapting memory for fusing information from previous views.

Decoder Depth Estimation +1

Know Your Boundaries: Constraining Gaussian Processes by Variational Harmonic Features

1 code implementation10 Apr 2019 Arno Solin, Manon Kok

Gaussian processes (GPs) provide a powerful framework for extrapolation, interpolation, and noise removal in regression and classification.

Gaussian Processes General Classification +1

Interpolation Consistency Training for Semi-Supervised Learning

4 code implementations9 Mar 2019 Vikas Verma, Kenji Kawaguchi, Alex Lamb, Juho Kannala, Arno Solin, Yoshua Bengio, David Lopez-Paz

We introduce Interpolation Consistency Training (ICT), a simple and computationally efficient algorithm for training Deep Neural Networks in the semi-supervised learning paradigm.

General Classification Semi-Supervised Image Classification
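As a rough sketch of the consistency idea behind ICT (an illustration of the general recipe, not the paper's reference code): predictions at interpolations of unlabelled inputs are pushed towards the same interpolation of a teacher's predictions on those inputs. The student/teacher callables and data here are hypothetical.

```python
import numpy as np

def ict_consistency_loss(student, teacher, u1, u2, alpha=1.0,
                         rng=np.random.default_rng()):
    """Unsupervised consistency term in the spirit of ICT (hypothetical sketch).

    Predictions at mixed-up unlabelled inputs are pushed towards the same
    mixup of the teacher's predictions on the original inputs.
    """
    lam = rng.beta(alpha, alpha)
    u_mix = lam * u1 + (1.0 - lam) * u2
    target = lam * teacher(u1) + (1.0 - lam) * teacher(u2)
    return np.mean((student(u_mix) - target) ** 2)

# Toy usage with a linear "network" and a shrunken copy as the teacher
W = np.random.randn(3, 2)
student = lambda u: u @ W
teacher = lambda u: u @ (0.9 * W)   # stand-in for an exponential moving average
u1, u2 = np.random.randn(8, 3), np.random.randn(8, 3)
loss = ict_consistency_loss(student, teacher, u1, u2)
```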

Unstructured Multi-View Depth Estimation Using Mask-Based Multiplane Representation

1 code implementation6 Feb 2019 Yuxin Hou, Arno Solin, Juho Kannala

This paper presents a novel method, MaskMVS, to solve depth estimation for unstructured multi-view image-pose pairs.

Depth Estimation

End-to-End Probabilistic Inference for Nonstationary Audio Analysis

1 code implementation31 Jan 2019 William J. Wilkinson, Michael Riis Andersen, Joshua D. Reiss, Dan Stowell, Arno Solin

A typical audio signal processing pipeline comprises multiple disjoint analysis stages, such as calculation of a time-frequency representation followed by spectrogram-based feature analysis.

Audio Signal Processing regression

Infinite-Horizon Gaussian Processes

1 code implementation NeurIPS 2018 Arno Solin, James Hensman, Richard E. Turner

The complexity is still cubic in the state dimension $m$, which is an impediment to practical application.

Gaussian Processes

Unifying Probabilistic Models for Time-Frequency Analysis

1 code implementation6 Nov 2018 William J. Wilkinson, Michael Riis Andersen, Joshua D. Reiss, Dan Stowell, Arno Solin

In audio signal processing, probabilistic time-frequency models have many benefits over their non-probabilistic counterparts.

Audio Signal Processing Gaussian Processes +1

Deep Learning Based Speed Estimation for Constraining Strapdown Inertial Navigation on Smartphones

1 code implementation10 Aug 2018 Santiago Cortés, Arno Solin, Juho Kannala

Strapdown inertial navigation systems are sensitive to the quality of the data provided by the accelerometer and gyroscope.

ADVIO: An authentic dataset for visual-inertial odometry

1 code implementation ECCV 2018 Santiago Cortés, Arno Solin, Esa Rahtu, Juho Kannala

The lack of realistic and open benchmarking datasets for pedestrian visual-inertial odometry has made it hard to pinpoint differences in published methods.

Benchmarking

Pioneer Networks: Progressively Growing Generative Autoencoder

1 code implementation9 Jul 2018 Ari Heljakka, Arno Solin, Juho Kannala

Instead, we propose the Progressively Growing Generative Autoencoder (PIONEER) network which achieves high-quality reconstruction with $128{\times}128$ images without requiring a GAN discriminator.

Image Generation

Robust Gyroscope-Aided Camera Self-Calibration

1 code implementation31 May 2018 Santiago Cortés Reina, Arno Solin, Juho Kannala

This application paper proposes a model for estimating the parameters on the fly by fusing gyroscope and camera data, both readily available in modern day smartphones.

Camera Calibration Video Stabilization

Scalable Magnetic Field SLAM in 3D Using Gaussian Process Maps

no code implementations5 Apr 2018 Manon Kok, Arno Solin

We present a method for scalable and fully 3D magnetic field simultaneous localisation and mapping (SLAM) using local anomalies in the magnetic field as a source of position information.

Position

Recursive Chaining of Reversible Image-to-image Translators For Face Aging

2 code implementations14 Feb 2018 Ari Heljakka, Arno Solin, Juho Kannala

By treating the age phases as a sequence of image domains, we construct a chain of transformers that map images from one age domain to the next.

Translation

State Space Gaussian Processes with Non-Gaussian Likelihood

no code implementations ICML 2018 Hannes Nickisch, Arno Solin, Alexander Grigorievskiy

We provide a comprehensive overview and tooling for GP modeling with non-Gaussian likelihoods using state space methods.

Gaussian Processes regression

Inertial Odometry on Handheld Smartphones

1 code implementation1 Mar 2017 Arno Solin, Santiago Cortes, Esa Rahtu, Juho Kannala

Building a complete inertial navigation system using the limited-quality data provided by current smartphones has been regarded as challenging, if not impossible.

Variational Fourier features for Gaussian processes

1 code implementation21 Nov 2016 James Hensman, Nicolas Durrande, Arno Solin

This work brings together two powerful concepts in Gaussian processes: the variational approach to sparse approximation and the spectral representation of Gaussian processes.

Gaussian Processes

Regularizing Solutions to the MEG Inverse Problem Using Space-Time Separable Covariance Functions

no code implementations17 Apr 2016 Arno Solin, Pasi Jylänki, Jaakko Kauramäki, Tom Heskes, Marcel A. J. van Gerven, Simo Särkkä

We apply the method to both simulated and empirical data, and demonstrate the efficiency and generality of our Bayesian source reconstruction approach which subsumes various classical approaches in the literature.

Computationally Efficient Bayesian Learning of Gaussian Process State Space Models

no code implementations7 Jun 2015 Andreas Svensson, Arno Solin, Simo Särkkä, Thomas B. Schön

We present a procedure for efficient Bayesian learning in Gaussian process state space models, where the representation is formed by projecting the problem onto a set of approximate eigenfunctions derived from the prior covariance structure.

Gaussian Processes State Space Models

Sigma-Point Filtering and Smoothing Based Parameter Estimation in Nonlinear Dynamic Systems

1 code implementation23 Apr 2015 Juho Kokkala, Arno Solin, Simo Särkkä

We consider approximate maximum likelihood parameter estimation in nonlinear state-space models.

Methodology Dynamical Systems Optimization and Control Computation

Hilbert Space Methods for Reduced-Rank Gaussian Process Regression

2 code implementations21 Jan 2014 Arno Solin, Simo Särkkä

On this approximate eigenbasis the eigenvalues of the covariance function can be expressed as simple functions of the spectral density of the Gaussian process, which allows the GP inference to be solved under a computational cost scaling as $\mathcal{O}(nm^2)$ (initial) and $\mathcal{O}(m^3)$ (hyperparameter learning) with $m$ basis functions and $n$ data points.

regression
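The construction described in the abstract can be sketched in a few lines for a 1D squared-exponential kernel: the Laplacian eigenfunctions on [-L, L] are sines, their eigenvalues give the frequencies at which the spectral density is evaluated, and regression with the resulting m weighted basis functions costs O(nm^2). The values below are illustrative, and this is a toy sketch of the approach rather than the authors' code.

```python
import numpy as np

def se_spectral_density(w, variance=1.0, lengthscale=1.0):
    """Spectral density of the 1D squared-exponential kernel."""
    return variance * np.sqrt(2.0 * np.pi) * lengthscale * np.exp(-0.5 * (lengthscale * w) ** 2)

def hilbert_gp_mean(x, y, x_star, m=32, L=5.0, noise_var=0.01,
                    variance=1.0, lengthscale=1.0):
    """Reduced-rank GP regression mean with m Laplacian eigenfunctions on [-L, L].

    Forming the m x m system costs O(n m^2) and solving it O(m^3),
    matching the scaling quoted in the abstract.
    """
    j = np.arange(1, m + 1)
    sqrt_lam = np.pi * j / (2.0 * L)                    # square roots of the eigenvalues
    phi = lambda t: np.sin(np.outer(t + L, sqrt_lam)) / np.sqrt(L)  # sinusoidal basis
    S = se_spectral_density(sqrt_lam, variance, lengthscale)        # prior basis weights

    Phi = phi(x)                                        # (n, m) feature matrix
    A = Phi.T @ Phi + noise_var * np.diag(1.0 / S)      # (m, m) system matrix
    alpha = np.linalg.solve(A, Phi.T @ y)
    return phi(x_star) @ alpha                          # posterior mean at test inputs

# Toy usage
x = np.random.uniform(-3.0, 3.0, 200)
y = np.sin(2.0 * x) + 0.1 * np.random.randn(x.size)
x_star = np.linspace(-3.0, 3.0, 100)
mean = hilbert_gp_mean(x, y, x_star, lengthscale=0.5)
```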
