no code implementations • 13 Feb 2023 • Lassi Meronen, Martin Trapp, Andrea Pilzer, Le Yang, Arno Solin
Dynamic neural networks are a recent technique that promises a remedy for the increasing size of modern deep learning models by dynamically adapting their computational cost to the difficulty of the input samples.
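A common instance of this idea is an early-exit network: an intermediate classifier head lets confident ("easy") samples leave the network after a few layers, while hard samples run the full depth. The sketch below is a generic illustration of that mechanism with random weights, not the specific method of this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical two-block network with an early-exit head after block 1.
W1, W_exit = rng.normal(size=(4, 8)), rng.normal(size=(8, 3))
W2, W_final = rng.normal(size=(8, 8)), rng.normal(size=(8, 3))

def predict(x, threshold=0.9):
    """Return (class probabilities, number of blocks evaluated)."""
    h1 = np.tanh(x @ W1)
    p_exit = softmax(h1 @ W_exit)
    if p_exit.max() >= threshold:   # easy sample: exit early, saving compute
        return p_exit, 1
    h2 = np.tanh(h1 @ W2)           # hard sample: evaluate the full depth
    return softmax(h2 @ W_final), 2

probs, depth = predict(rng.normal(size=4))
```

The per-sample cost thus adapts to input difficulty via the confidence threshold.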
no code implementations • 31 Jan 2023 • Ella Tamir, Martin Trapp, Arno Solin
The dynamic Schrödinger bridge problem provides an appealing setting for solving optimal transport problems by learning non-linear diffusion processes using efficient iterative solvers.
no code implementations • 27 Dec 2022 • Vikas Verma, Sarthak Mittal, Wai Hoh Tang, Hieu Pham, Juho Kannala, Yoshua Bengio, Arno Solin, Kenji Kawaguchi
Mixup is a popular data augmentation technique for training deep neural networks where additional samples are generated by linearly interpolating pairs of inputs and their labels.
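The interpolation step itself is a one-liner: draw a mixing coefficient from a Beta distribution and take the same convex combination of both the inputs and their one-hot labels. A minimal NumPy sketch:

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2, rng=None):
    """Mixup: convex combination of two inputs and their one-hot labels."""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)  # mixing coefficient in (0, 1)
    return lam * x1 + (1 - lam) * x2, lam * y1 + (1 - lam) * y2

x1, y1 = np.array([1.0, 0.0]), np.array([1.0, 0.0])  # class 0 sample
x2, y2 = np.array([0.0, 1.0]), np.array([0.0, 1.0])  # class 1 sample
x_mix, y_mix = mixup(x1, y1, x2, y2, rng=np.random.default_rng(0))
```

The mixed label remains a valid probability vector, so the usual cross-entropy loss applies unchanged.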
no code implementations • 11 Nov 2022 • Rui Li, ST John, Arno Solin
Gaussian process training decomposes into inference of the (approximate) posterior and learning of the hyperparameters.
no code implementations • 2 Nov 2022 • Paul E. Chang, Prakhar Verma, ST John, Victor Picheny, Henry Moss, Arno Solin
Gaussian processes (GPs) are the main surrogate functions used in sequential modelling tasks such as Bayesian optimization and active learning.
no code implementations • 1 Nov 2022 • Andrea Pilzer, Yuxin Hou, Niki Loppi, Arno Solin, Juho Kannala
We introduce visual hints expansion for guiding stereo matching to improve generalization.
1 code implementation • 16 Aug 2022 • Subhankar Roy, Martin Trapp, Andrea Pilzer, Juho Kannala, Nicu Sebe, Elisa Ricci, Arno Solin
Source-free domain adaptation (SFDA) aims to adapt a classifier to an unlabelled target data set by only using a pre-trained source model.
no code implementations • 21 Jun 2022 • Severi Rissanen, Markus Heinonen, Arno Solin
While diffusion models have shown great success in image generation, their noise-inverting generative process does not explicitly consider the structure of images, such as their inherent multi-scale nature.
no code implementations • 17 Jun 2022 • Ari Heljakka, Martin Trapp, Juho Kannala, Arno Solin
This observed 'predictive' multiplicity (PM) also implies elusive differences in the internals of the models, their 'representational' multiplicity (RM).
no code implementations • 27 May 2022 • Arno Solin, Rui Li, Andrea Pilzer
The fusion of camera sensor and inertial data is a leading method for ego-motion tracking in autonomous and smart devices.
no code implementations • 16 Nov 2021 • Alexander Nikitin, ST John, Arno Solin, Samuel Kaski
Gaussian processes (GPs) provide a principled and direct approach for inference and learning on graphs.
1 code implementation • NeurIPS 2021 • Vincent Adam, Paul E. Chang, Mohammad Emtiyaz Khan, Arno Solin
Sparse variational Gaussian process (SVGP) methods are a common choice for non-conjugate Gaussian process inference because of their computational benefits.
1 code implementation • NeurIPS 2021 • Oliver Hamelijnck, William J. Wilkinson, Niki A. Loppi, Arno Solin, Theodoros Damoulas
We introduce a scalable approach to Gaussian process inference that combines spatio-temporal filtering with natural gradient variational inference, resulting in a non-conjugate GP method for multivariate data that scales linearly with respect to time.
1 code implementation • 2 Nov 2021 • William J. Wilkinson, Simo Särkkä, Arno Solin
We formulate natural gradient variational inference (VI), expectation propagation (EP), and posterior linearisation (PL) as extensions of Newton's method for optimising the parameters of a Bayesian posterior distribution.
1 code implementation • NeurIPS 2021 • Arno Solin, Ella Tamir, Prakhar Verma
Simulation-based techniques such as variants of stochastic Runge-Kutta are the de facto approach for inference with stochastic differential equations (SDEs) in machine learning.
2 code implementations • NeurIPS 2021 • Lassi Meronen, Martin Trapp, Arno Solin
Neural network models are known to reinforce hidden data biases, making them unreliable and difficult to interpret.
no code implementations • 29 Sep 2021 • Dmitry Senushkin, Iaroslav Melekhov, Mikhail Romanov, Anton Konushin, Juho Kannala, Arno Solin
We present a novel gradient-based multi-task learning (MTL) approach that balances training in multi-task systems by aligning the independent components of the training objective.
no code implementations • NeurIPS Workshop DLDE 2021 • Prakhar Verma, Vincent Adam, Arno Solin
We frame the problem of learning stochastic differential equations (SDEs) from noisy observations as an inference problem and aim to maximize the marginal likelihood of the observations in a joint model of the latent paths and the noisy observations.
1 code implementation • 22 Jun 2021 • Otto Seiskari, Pekka Rantalankila, Juho Kannala, Jerry Ylilammi, Esa Rahtu, Arno Solin
We present HybVIO, a novel hybrid approach for combining filtering-based visual-inertial odometry (VIO) with optimization-based SLAM.
1 code implementation • Approximate Inference (AABI) Symposium 2021 • Will Tebbutt, Arno Solin, Richard E. Turner
Pseudo-point approximations, one of the gold-standard methods for scaling GPs to large data sets, are well suited for handling off-the-grid spatial data.
1 code implementation • 19 Mar 2021 • William J. Wilkinson, Arno Solin, Vincent Adam
Approximate Bayesian inference methods that scale to very large datasets are crucial in leveraging probabilistic models for real-world time series.
1 code implementation • 5 Jan 2021 • Yuxin Hou, Arno Solin, Juho Kannala
Flow predictions enable the target view to re-use pixels directly, but can easily lead to distorted results.
1 code implementation • 5 Nov 2020 • Rinu Boney, Jussi Sainio, Mikko Kaivola, Arno Solin, Juho Kannala
We validate the platform with reinforcement learning experiments and provide baseline results on a set of benchmark tasks.
1 code implementation • NeurIPS 2020 • Lassi Meronen, Christabella Irwanto, Arno Solin
We introduce a new family of non-linear neural network activation functions that mimic the properties induced by the widely-used Matérn family of kernels in Gaussian process (GP) models.
1 code implementation • 18 Oct 2020 • Yuxin Hou, Muhammad Kamran Janjua, Juho Kannala, Arno Solin
We propose a method for fusing stereo disparity estimation with movement-induced prior information.
1 code implementation • ICML 2020 • William J. Wilkinson, Paul E. Chang, Michael Riis Andersen, Arno Solin
EP offers benefits over the traditional methods through the introduction of the so-called cavity distribution; we combine these benefits with the computational efficiency of linearisation, and provide an extensive empirical analysis demonstrating the efficacy of various algorithms under this unifying framework.
1 code implementation • 9 Jul 2020 • Paul E. Chang, William J. Wilkinson, Mohammad Emtiyaz Khan, Arno Solin
Gaussian process (GP) regression with 1D inputs can often be performed in linear time via a stochastic differential equation formulation.
no code implementations • 24 Jun 2020 • Lassi Meronen, William J. Wilkinson, Arno Solin
We consider a visually dense approach, where the IMU data is fused with the dense optical flow field estimated from the camera data.
1 code implementation • 22 Jun 2020 • Perttu Hämäläinen, Martin Trapp, Tuure Saloheimo, Arno Solin
We propose Deep Residual Mixture Models (DRMMs), a novel deep generative model architecture.
2 code implementations • NeurIPS 2020 • Ari Heljakka, Yuxin Hou, Juho Kannala, Arno Solin
These networks can faithfully reproduce individual real-world input images like regular autoencoders, but also generate a fused sample from an arbitrary combination of several such images, allowing instantaneous 'style-mixing' and other new applications.
no code implementations • 6 Dec 2019 • Yuxin Hou, Ari Heljakka, Arno Solin
While frame-independent predictions with deep neural networks have become the prominent solutions to many computer vision tasks, the potential benefits of utilizing correlations between frames have received less attention.
no code implementations • ICML 2020 • Wessel P. Bruinsma, Eric Perim, Will Tebbutt, J. Scott Hosking, Arno Solin, Richard E. Turner
Multi-output Gaussian processes (MOGPs) leverage the flexibility and interpretability of GPs while capturing structure across outputs, which is desirable, for example, in spatio-temporal modelling.
no code implementations • Approximate Inference (AABI) Symposium 2019 • William J. Wilkinson, Paul E. Chang, Michael Riis Andersen, Arno Solin
The extended Kalman filter (EKF) is a classical signal processing algorithm which performs efficient approximate Bayesian inference in non-conjugate models by linearising the local measurement function, avoiding the need to compute intractable integrals when calculating the posterior.
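The linearisation step amounts to a single closed-form measurement update: replace the non-linear observation function by its Jacobian at the prior mean, then apply the standard Kalman update. A minimal sketch with a hypothetical quadratic measurement:

```python
import numpy as np

def ekf_update(m, P, y, h, H_jac, R):
    """One EKF measurement update for y = h(x) + r, r ~ N(0, R)."""
    H = H_jac(m)                      # Jacobian of h at the prior mean
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    m_new = m + K @ (y - h(m))        # posterior mean
    P_new = P - K @ S @ K.T           # posterior covariance
    return m_new, P_new

# Hypothetical toy model: observe the square of the first state component
h = lambda x: np.array([x[0] ** 2])
H_jac = lambda x: np.array([[2.0 * x[0], 0.0]])
m0, P0 = np.array([1.0, 0.0]), np.eye(2)
m1, P1 = ekf_update(m0, P0, y=np.array([1.5]), h=h, H_jac=H_jac, R=0.1 * np.eye(1))
```

The update shifts the mean toward the observation and shrinks the variance along the observed direction, all without evaluating any intractable integral.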
no code implementations • 2 Jun 2019 • Santiago Cortés Reina, Yuxin Hou, Juho Kannala, Arno Solin
Modern smartphones have all the sensing capabilities required for accurate and robust navigation and tracking.
1 code implementation • 12 Apr 2019 • Ari Heljakka, Arno Solin, Juho Kannala
retaining the identity of a face), sharp generated/reconstructed samples in high resolutions, and a well-structured latent space that supports semantic manipulation of the inputs.
1 code implementation • ICCV 2019 • Yuxin Hou, Juho Kannala, Arno Solin
The flexibility of the Gaussian process (GP) prior provides adapting memory for fusing information from previous views.
1 code implementation • 10 Apr 2019 • Arno Solin, Manon Kok
Gaussian processes (GPs) provide a powerful framework for extrapolation, interpolation, and noise removal in regression and classification.
4 code implementations • 9 Mar 2019 • Vikas Verma, Kenji Kawaguchi, Alex Lamb, Juho Kannala, Arno Solin, Yoshua Bengio, David Lopez-Paz
We introduce Interpolation Consistency Training (ICT), a simple and computationally efficient algorithm for training deep neural networks in the semi-supervised learning paradigm.
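The consistency term encourages the model's prediction at an interpolation of two unlabelled inputs to match the interpolation of the individual predictions. The sketch below is a simplified illustration (the paper additionally uses a mean-teacher model for the targets, omitted here):

```python
import numpy as np

def interpolation_consistency_loss(f, u1, u2, lam=0.5):
    """Simplified ICT penalty on a pair of unlabelled inputs u1, u2."""
    target = lam * f(u1) + (1 - lam) * f(u2)   # interpolated predictions
    pred = f(lam * u1 + (1 - lam) * u2)        # prediction at interpolated input
    return np.mean((pred - target) ** 2)

f_linear = lambda x: 2.0 * x + 1.0   # a linear model incurs zero penalty
f_nonlin = lambda x: np.tanh(x)      # a non-linear model generally does not
u1, u2 = np.array([0.0, 1.0]), np.array([2.0, 3.0])
loss_lin = interpolation_consistency_loss(f_linear, u1, u2)
loss_non = interpolation_consistency_loss(f_nonlin, u1, u2)
```

Minimising this penalty pushes the decision function toward low-density regions between unlabelled points, which is the intuition behind the method.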
1 code implementation • 6 Feb 2019 • Yuxin Hou, Arno Solin, Juho Kannala
This paper presents a novel method, MaskMVS, to solve depth estimation for unstructured multi-view image-pose pairs.
1 code implementation • 31 Jan 2019 • William J. Wilkinson, Michael Riis Andersen, Joshua D. Reiss, Dan Stowell, Arno Solin
A typical audio signal processing pipeline includes multiple disjoint analysis stages, including calculation of a time-frequency representation followed by spectrogram-based feature analysis.
1 code implementation • NeurIPS 2018 • Arno Solin, James Hensman, Richard E. Turner
The complexity is still cubic in the state dimension $m$, which is an impediment to practical application.
1 code implementation • 6 Nov 2018 • William J. Wilkinson, Michael Riis Andersen, Joshua D. Reiss, Dan Stowell, Arno Solin
In audio signal processing, probabilistic time-frequency models have many benefits over their non-probabilistic counterparts.
1 code implementation • 10 Aug 2018 • Santiago Cortés, Arno Solin, Juho Kannala
Strapdown inertial navigation systems are sensitive to the quality of the data provided by the accelerometer and gyroscope.
1 code implementation • ECCV 2018 • Santiago Cortés, Arno Solin, Esa Rahtu, Juho Kannala
The lack of realistic and open benchmarking datasets for pedestrian visual-inertial odometry has made it hard to pinpoint differences in published methods.
1 code implementation • 9 Jul 2018 • Ari Heljakka, Arno Solin, Juho Kannala
Instead, we propose the Progressively Growing Generative Autoencoder (PIONEER) network which achieves high-quality reconstruction with $128{\times}128$ images without requiring a GAN discriminator.
1 code implementation • 31 May 2018 • Santiago Cortés Reina, Arno Solin, Juho Kannala
This application paper proposes a model for estimating the parameters on the fly by fusing gyroscope and camera data, both readily available in modern day smartphones.
no code implementations • 5 Apr 2018 • Manon Kok, Arno Solin
We present a method for scalable and fully 3D magnetic field simultaneous localisation and mapping (SLAM) using local anomalies in the magnetic field as a source of position information.
2 code implementations • 14 Feb 2018 • Ari Heljakka, Arno Solin, Juho Kannala
By treating the age phases as a sequence of image domains, we construct a chain of transformers that map images from one age domain to the next.
no code implementations • ICML 2018 • Hannes Nickisch, Arno Solin, Alexander Grigorievskiy
We provide a comprehensive overview and tooling for GP modeling with non-Gaussian likelihoods using state space methods.
no code implementations • 2 Aug 2017 • Arno Solin, Santiago Cortes, Esa Rahtu, Juho Kannala
This paper presents a novel method for visual-inertial odometry.
1 code implementation • 1 Mar 2017 • Arno Solin, Santiago Cortes, Esa Rahtu, Juho Kannala
Building a complete inertial navigation system using the limited-quality data provided by current smartphones has been regarded as challenging, if not impossible.
1 code implementation • 21 Nov 2016 • James Hensman, Nicolas Durrande, Arno Solin
This work brings together two powerful concepts in Gaussian processes: the variational approach to sparse approximation and the spectral representation of Gaussian processes.
no code implementations • 17 Apr 2016 • Arno Solin, Pasi Jylänki, Jaakko Kauramäki, Tom Heskes, Marcel A. J. van Gerven, Simo Särkkä
We apply the method to both simulated and empirical data, and demonstrate the efficiency and generality of our Bayesian source reconstruction approach which subsumes various classical approaches in the literature.
no code implementations • 15 Sep 2015 • Arno Solin, Manon Kok, Niklas Wahlström, Thomas B. Schön, Simo Särkkä
Anomalies in the ambient magnetic field can be used as features in indoor positioning and navigation.
no code implementations • 7 Jun 2015 • Andreas Svensson, Arno Solin, Simo Särkkä, Thomas B. Schön
We present a procedure for efficient Bayesian learning in Gaussian process state space models, where the representation is formed by projecting the problem onto a set of approximate eigenfunctions derived from the prior covariance structure.
1 code implementation • 23 Apr 2015 • Juho Kokkala, Arno Solin, Simo Särkkä
We consider approximate maximum likelihood parameter estimation in nonlinear state-space models.
2 code implementations • 21 Jan 2014 • Arno Solin, Simo Särkkä
On this approximate eigenbasis the eigenvalues of the covariance function can be expressed as simple functions of the spectral density of the Gaussian process, which allows the GP inference to be solved under a computational cost scaling as $\mathcal{O}(nm^2)$ (initial) and $\mathcal{O}(m^3)$ (hyperparameter learning) with $m$ basis functions and $n$ data points.
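This reduced-rank construction is easy to check numerically: on a domain $[-L, L]$, the Laplacian eigenfunctions are sines, and weighting them by the spectral density of a squared-exponential kernel reproduces that kernel away from the boundary. The sketch below uses assumed hyperparameter values for illustration.

```python
import numpy as np

s2, ell = 1.0, 0.5                 # kernel magnitude and length-scale (assumed)
L, m = 4.0, 64                     # domain half-width, number of basis functions
x = np.linspace(-1.0, 1.0, 50)     # evaluation points well inside [-L, L]

# Laplacian eigenbasis on [-L, L] with Dirichlet boundary conditions:
# phi_j(x) = sin(pi j (x + L) / (2L)) / sqrt(L),  lambda_j = (pi j / (2L))^2
j = np.arange(1, m + 1)
sqrt_lam = np.pi * j / (2.0 * L)
Phi = np.sin(sqrt_lam * (x[:, None] + L)) / np.sqrt(L)

# Spectral density of the 1D squared-exponential kernel
S = s2 * np.sqrt(2.0 * np.pi) * ell * np.exp(-0.5 * (sqrt_lam * ell) ** 2)

# Rank-m approximation: K(x, x') ~ sum_j S(sqrt(lambda_j)) phi_j(x) phi_j(x')
K_approx = Phi @ (S[:, None] * Phi.T)
K_exact = s2 * np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell ** 2)
err = np.max(np.abs(K_approx - K_exact))
```

With the basis precomputed, GP inference reduces to linear algebra in the $m$-dimensional coefficient space, giving the $\mathcal{O}(nm^2)$ and $\mathcal{O}(m^3)$ costs quoted above.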