Search Results for author: Harald Oberhauser

Found 19 papers, 11 papers with code

Kernelized Cumulants: Beyond Kernel Mean Embeddings

1 code implementation • 29 Jan 2023 • Patric Bonnier, Harald Oberhauser, Zoltán Szabó

In $\mathbb R^d$, it is well-known that cumulants provide an alternative to moments that can achieve the same goals with numerous benefits such as lower variance estimators.
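
A minimal numpy sketch of the classical cumulants in $\mathbb R^d$ that this paper kernelizes; the first three cumulants coincide with the mean, covariance, and third central moment tensor. The data and shapes below are illustrative, and the paper's kernelized estimators are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((1000, 3))  # 1000 samples in R^3

# First three classical cumulants: kappa_1 = mean, kappa_2 = covariance,
# kappa_3 = third central moment tensor (cumulants and central moments
# agree up to order three).
kappa1 = x.mean(axis=0)
xc = x - kappa1
kappa2 = xc.T @ xc / len(x)
kappa3 = np.einsum('ni,nj,nk->ijk', xc, xc, xc) / len(x)
print(kappa1.shape, kappa2.shape, kappa3.shape)  # (3,) (3, 3) (3, 3, 3)
```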

SOBER: Scalable Batch Bayesian Optimization and Quadrature using Recombination Constraints

no code implementations • 27 Jan 2023 • Masaki Adachi, Satoshi Hayakawa, Saad Hamid, Martin Jørgensen, Harald Oberhauser, Michael A. Osborne

Batch Bayesian optimisation (BO) has been shown to be a sample-efficient method for optimisation in which expensive-to-evaluate objective functions can be queried in parallel.

Bayesian Optimisation • Drug Discovery

Sampling-based Nyström Approximation and Kernel Quadrature

no code implementations • 23 Jan 2023 • Satoshi Hayakawa, Harald Oberhauser, Terry Lyons

We analyze the Nyström approximation of a positive definite kernel associated with a probability measure.

Learning Theory
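
As a reference point, a short numpy sketch of the standard Nyström approximation that the paper analyzes: approximate a kernel matrix from m landmark points via K ≈ K_nm K_mm⁺ K_nmᵀ. The RBF kernel, data, and landmark count are illustrative choices, not the paper's sampling scheme.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 2))

def rbf(A, B, gamma=0.5):
    # Gaussian (RBF) kernel matrix between the rows of A and B.
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Nystrom approximation from m landmark points: K ~ K_nm K_mm^+ K_nm^T.
m = 50
idx = rng.choice(len(X), size=m, replace=False)
K_nm = rbf(X, X[idx])
K_approx = K_nm @ np.linalg.pinv(rbf(X[idx], X[idx])) @ K_nm.T

K_full = rbf(X, X)
print(np.linalg.norm(K_full - K_approx) / np.linalg.norm(K_full))
```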

Fast Bayesian Inference with Batch Bayesian Quadrature via Kernel Recombination

2 code implementations • 9 Jun 2022 • Masaki Adachi, Satoshi Hayakawa, Martin Jørgensen, Harald Oberhauser, Michael A. Osborne

Empirically, we find that our approach significantly outperforms both state-of-the-art BQ techniques and Nested Sampling in sampling efficiency on various real-world datasets, including lithium-ion battery analytics.

Bayesian Inference • Numerical Integration

Capturing Graphs with Hypo-Elliptic Diffusions

1 code implementation • 27 May 2022 • Csaba Toth, Darrick Lee, Celia Hacker, Harald Oberhauser

This results in a novel tensor-valued graph operator, which we call the hypo-elliptic graph Laplacian.
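
For orientation, a sketch of the ordinary scalar graph Laplacian L = D − A, the operator the hypo-elliptic graph Laplacian generalizes to a tensor-valued one; the paper's construction is not reproduced here, and the small path graph is just an example.

```python
import numpy as np

# Scalar graph Laplacian L = D - A of a 3-node path graph; its spectrum
# governs diffusion on the graph, which the hypo-elliptic variant enriches
# with tensor-valued (signature-like) information.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
L = np.diag(A.sum(axis=1)) - A
print(np.linalg.eigvalsh(L))
```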

Tangent Space and Dimension Estimation with the Wasserstein Distance

no code implementations • 12 Oct 2021 • Uzu Lim, Harald Oberhauser, Vidit Nanda

Consider a set of points sampled independently near a smooth compact submanifold of Euclidean space.
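
A toy sketch of this setting using local PCA, a classical baseline for tangent space and dimension estimation; the paper's estimator uses the Wasserstein distance instead, and the noisy circle, neighbourhood size, and eigenvalue threshold below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
t = rng.uniform(0.0, 2.0 * np.pi, 400)
X = np.c_[np.cos(t), np.sin(t)] + 0.01 * rng.standard_normal((400, 2))

# Local PCA at a base point: the number of dominant eigenvalues of the local
# covariance estimates the intrinsic dimension (~1 for a curve).
p = X[0]
nbrs = X[np.argsort(((X - p) ** 2).sum(axis=1))[:30]]
evals = np.linalg.eigvalsh(np.cov(nbrs.T))[::-1]
dim_est = int((evals > 0.05 * evals[0]).sum())
print(dim_est)
```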

Neural SDEs as Infinite-Dimensional GANs

1 code implementation • 6 Feb 2021 • Patrick Kidger, James Foster, Xuechen Li, Harald Oberhauser, Terry Lyons

Stochastic differential equations (SDEs) are a staple of mathematical modelling of temporal dynamics.

Time Series Analysis
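
As background, an Euler–Maruyama simulation of a simple SDE, the kind of temporal dynamics a Neural SDE parametrizes with networks for drift and diffusion; the Ornstein–Uhlenbeck coefficients and step size below are illustrative, and no GAN training is shown.

```python
import numpy as np

rng = np.random.default_rng(3)
n_steps, dt = 1000, 1e-2
x = np.empty(n_steps + 1)
x[0] = 1.0
for k in range(n_steps):
    # dX_t = -X_t dt + 0.5 dW_t, discretized with Euler-Maruyama.
    x[k + 1] = x[k] - x[k] * dt + 0.5 * np.sqrt(dt) * rng.standard_normal()
```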

Nonlinear Independent Component Analysis for Discrete-Time and Continuous-Time Signals

1 code implementation • 4 Feb 2021 • Alexander Schell, Harald Oberhauser

We study the classical problem of recovering a multidimensional source signal from observations of nonlinear mixtures of this signal.

Contrastive Learning • Time Series Analysis
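
A minimal sketch of the observation model in nonlinear ICA: independent sources pushed through an unknown nonlinear mixture. The particular mixing map below is an illustrative assumption; the paper's recovery method is not shown.

```python
import numpy as np

rng = np.random.default_rng(4)
s = rng.uniform(-1.0, 1.0, size=(1000, 2))  # independent source signals
M = np.array([[1.0, 0.5],
              [0.3, 1.0]])
x = np.tanh(s @ M)  # observed nonlinear mixture; the task: recover s from x
```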

Neural SDEs Made Easy: SDEs are Infinite-Dimensional GANs

no code implementations • 1 Jan 2021 • Patrick Kidger, James Foster, Xuechen Li, Harald Oberhauser, Terry Lyons

Several authors have introduced \emph{Neural Stochastic Differential Equations} (Neural SDEs), often involving complex theory with various limitations.

Seq2Tens: An Efficient Representation of Sequences by Low-Rank Tensor Projections

1 code implementation • ICLR 2021 • Csaba Toth, Patric Bonnier, Harald Oberhauser

Sequential data such as time series, video, or text can be challenging to analyse as the ordered structure gives rise to complex dependencies.

Imputation • Time Series Analysis +1

Carathéodory Sampling for Stochastic Gradient Descent

1 code implementation • 2 Jun 2020 • Francesco Cosentino, Harald Oberhauser, Alessandro Abate

Various flavours of Stochastic Gradient Descent (SGD) replace the expensive summation that computes the full gradient with a small sum over a randomly selected subsample of the data set; this approximation is cheap but suffers from high variance.
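
A plain minibatch SGD sketch for least squares illustrating that trade-off; the batch size, learning rate, and synthetic data are illustrative, and the paper's Carathéodory sampling scheme is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.standard_normal((10_000, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.1 * rng.standard_normal(10_000)

# Minibatch SGD: the full-gradient sum over all 10k points is replaced by a
# cheap but noisy estimate from 32 random points per step.
w, lr, batch = np.zeros(5), 0.05, 32
for _ in range(2000):
    idx = rng.choice(len(X), size=batch, replace=False)
    grad = 2.0 * X[idx].T @ (X[idx] @ w - y[idx]) / batch
    w -= lr * grad
print(np.round(w, 2))  # close to w_true, up to minibatch noise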

A Randomized Algorithm to Reduce the Support of Discrete Measures

1 code implementation NeurIPS 2020 Francesco Cosentino, Harald Oberhauser, Alessandro Abate

Given a discrete probability measure supported on $N$ atoms and a set of $n$ real-valued functions, there exists a probability measure that is supported on a subset of $n+1$ of the original $N$ atoms and has the same mean when integrated against each of the $n$ functions.
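
A numerical illustration of this fact via the classical deterministic Carathéodory reduction (the paper's contribution is a randomized algorithm that accelerates this step); the atoms, weights, and test functions below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(6)
N, n = 200, 5
atoms = rng.standard_normal(N)                        # N atoms on the real line
w = rng.dirichlet(np.ones(N))                         # their probability weights
F = np.vstack([atoms ** k for k in range(1, n + 1)])  # n functions: x, ..., x^n
A = np.vstack([F, np.ones(N)])                        # integrals + total mass to preserve
target = A @ w

# While the support exceeds n + 1 atoms, step along a null direction of A
# restricted to the support until one weight hits zero.
support = np.flatnonzero(w > 0)
while len(support) > n + 1:
    ns = np.linalg.svd(A[:, support])[2][-1]  # a null vector (more columns than rows)
    if not (ns > 0).any():
        ns = -ns
    t = np.min(w[support][ns > 0] / ns[ns > 0])
    w[support] = np.clip(w[support] - t * ns, 0.0, None)
    w[np.isclose(w, 0.0, atol=1e-12)] = 0.0
    support = np.flatnonzero(w > 0)

print(len(support), np.allclose(A @ w, target))  # <= n + 1 atoms, same integrals
```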

Bayesian Learning from Sequential Data using Gaussian Processes with Signature Covariances

1 code implementation • ICML 2020 • Csaba Toth, Harald Oberhauser

We develop a Bayesian approach to learning from sequential data by using Gaussian processes (GPs) with so-called signature kernels as covariance functions.

Gaussian Processes • General Classification +2
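
A minimal GP regression sketch with a stand-in RBF covariance; in the paper this kernel is replaced by a signature kernel so that the GP inputs can be whole sequences. The data, length-scale, and noise level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
X = np.linspace(0.0, 1.0, 20)[:, None]
y = np.sin(2.0 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(20)

# GP posterior mean with an RBF kernel on scalar inputs; a signature kernel
# would play the same role for sequence-valued inputs.
k = lambda A, B: np.exp(-50.0 * (A - B.T) ** 2)
K = k(X, X) + 0.01 * np.eye(20)        # noise variance on the diagonal
Xs = np.linspace(0.0, 1.0, 100)[:, None]
mean = k(Xs, X) @ np.linalg.solve(K, y)
```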

Signature moments to characterize laws of stochastic processes

1 code implementation • 25 Oct 2018 • Ilya Chevyrev, Harald Oberhauser

This allows us to derive a metric of maximum mean discrepancy type for laws of stochastic processes and to study the topology it induces on the space of such laws.
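
For context, a standard (biased) MMD estimate between two vector-valued samples with an RBF kernel; the paper constructs the analogous metric on laws of stochastic processes using signature moments, which is not shown here. The kernel and data are illustrative.

```python
import numpy as np

def mmd2(X, Y, gamma=1.0):
    # Biased (V-statistic) squared-MMD estimate with a Gaussian kernel.
    k = lambda A, B: np.exp(-gamma * ((A[:, None] - B[None, :]) ** 2).sum(-1))
    return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()

rng = np.random.default_rng(8)
X = rng.standard_normal((300, 2))
Y = rng.standard_normal((300, 2)) + 1.0  # shifted distribution
print(mmd2(X, Y))                        # clearly positive for distinct laws
```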

Sketching the order of events

no code implementations • 31 Aug 2017 • Terry Lyons, Harald Oberhauser

We introduce features for massive data streams.

Kernels for sequentially ordered data

no code implementations • 29 Jan 2016 • Franz J. Király, Harald Oberhauser

We present a novel framework for kernel learning with sequential data of any kind, such as time series, sequences of graphs, or strings.

Time Series Analysis
