Search Results for author: James Foster

Found 10 papers, 6 papers with code

Generative Modelling of Lévy Area for High Order SDE Simulation

no code implementations • 4 Aug 2023 Andraž Jelinčič, Jiajie Tao, William F. Turner, Thomas Cass, James Foster, Hao Ni

In this paper, we propose LévyGAN, a deep-learning-based model for generating approximate samples of Lévy area conditional on a Brownian increment.

Efficient and Accurate Gradients for Neural SDEs

2 code implementations • NeurIPS 2021 Patrick Kidger, James Foster, Xuechen Li, Terry Lyons

This reduces computational cost (giving up to a 1.87× speedup) and removes the numerical truncation errors associated with gradient penalty.

Realistic Differentially-Private Transmission Power Flow Data Release

1 code implementation • 25 Mar 2021 David Smith, Frederik Geth, Elliott Vercoe, Andrew Feutrill, Ming Ding, Jonathan Chan, James Foster, Thierry Rakotoarivelo

For the modeling, design and planning of future energy transmission networks, it is vital for stakeholders to access faithful and useful power flow data, while provably maintaining the privacy of the business confidentiality of service providers.

Brownian bridge expansions for Lévy area approximations and particular values of the Riemann zeta function

no code implementations • 19 Feb 2021 James Foster, Karen Habermann

We study approximations for the Lévy area of Brownian motion which are based on the Fourier series expansion and a polynomial expansion of the associated Brownian bridge.

Probability • Numerical Analysis • Number Theory • 60F05, 60H35, 60J65, 41A10, 42A10, 11M06

Neural SDEs as Infinite-Dimensional GANs

1 code implementation • 6 Feb 2021 Patrick Kidger, James Foster, Xuechen Li, Harald Oberhauser, Terry Lyons

Stochastic differential equations (SDEs) are a staple of mathematical modelling of temporal dynamics.

Time Series • Time Series Analysis

Neural SDEs Made Easy: SDEs are Infinite-Dimensional GANs

no code implementations • 1 Jan 2021 Patrick Kidger, James Foster, Xuechen Li, Harald Oberhauser, Terry Lyons

Several authors have introduced Neural Stochastic Differential Equations (Neural SDEs), often involving complex theory with various limitations.

Neural CDEs for Long Time Series via the Log-ODE Method

no code implementations • 28 Sep 2020 James Morrill, Patrick Kidger, Cristopher Salvi, James Foster, Terry Lyons

Neural Controlled Differential Equations (Neural CDEs) are the continuous-time analogue of an RNN, just as Neural ODEs are analogous to ResNets.

Time Series • Time Series Analysis
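The RNN analogy above can be made concrete: a CDE dz = f(z) dx, discretised with an Euler step, updates a hidden state by an amount driven by each data increment. This is a bare sketch with illustrative names (a real Neural CDE uses a learned vector field and a proper ODE solver, e.g. via the authors' torchcde library), not the paper's log-ODE method itself:

```python
import numpy as np

def cde_euler(z0, xs, f):
    """Euler discretisation of a controlled differential equation
    dz_t = f(z_t) dx_t, the continuous-time object behind Neural CDEs.

    z0 : (h,) initial hidden state
    xs : (n, d) observed control path (e.g. an irregularly sampled series)
    f  : maps an (h,) state to an (h, d) vector field
         (a neural network in a Neural CDE)
    """
    z = np.asarray(z0, dtype=float)
    for k in range(len(xs) - 1):
        dx = xs[k + 1] - xs[k]      # data increment between observations
        z = z + f(z) @ dx           # RNN-like update driven by the data
    return z
```

The log-ODE method of the paper then replaces raw increments dx with higher-order log-signature terms over longer subintervals, which is what makes very long time series tractable.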

Neural Rough Differential Equations for Long Time Series

3 code implementations • 17 Sep 2020 James Morrill, Cristopher Salvi, Patrick Kidger, James Foster, Terry Lyons

Neural controlled differential equations (CDEs) are the continuous-time analogue of recurrent neural networks, as Neural ODEs are to residual networks, and offer a memory-efficient continuous-time way to model functions of potentially irregular time series.

Irregular Time Series • Time Series • +2

The Signature Kernel is the solution of a Goursat PDE

4 code implementations • 26 Jun 2020 Cristopher Salvi, Thomas Cass, James Foster, Terry Lyons, Weixin Yang

Recently, there has been an increased interest in the development of kernel methods for learning with sequential data.

Dimensionality Reduction • Time Series Analysis • +1
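The paper's key identity is that the signature kernel k(s, t) of two paths x and y solves the Goursat PDE ∂²k/∂s∂t = ⟨ẋ_s, ẏ_t⟩ k with k(0, ·) = k(·, 0) = 1, which can be integrated numerically on a grid. Below is a first-order explicit finite-difference sketch under that reading of the result (the function name is mine; the authors discuss more accurate schemes and released their own implementation):

```python
import numpy as np

def signature_kernel(x, y):
    """Signature kernel of two piecewise-linear paths via the Goursat PDE
    d^2 k / ds dt = <dx/ds, dy/dt> * k,  k(0, .) = k(., 0) = 1.

    x : (m, d) path, y : (n, d) path. Uses a first-order explicit
    finite-difference scheme over the grid of path increments.
    """
    dx = np.diff(x, axis=0)          # (m-1, d) increments of x
    dy = np.diff(y, axis=0)          # (n-1, d) increments of y
    inner = dx @ dy.T                # <dx_i, dy_j> for every grid cell
    K = np.ones((len(dx) + 1, len(dy) + 1))  # boundary condition k = 1
    for i in range(len(dx)):
        for j in range(len(dy)):
            K[i + 1, j + 1] = (K[i + 1, j] + K[i, j + 1] - K[i, j]
                               + inner[i, j] * K[i, j])
    return K[-1, -1]
```

As a sanity check, for two identical straight-line paths from 0 to 1 in one dimension the exact kernel is the series Σ 1/(n!)² ≈ 2.2796, which the grid solution approaches as the discretisation is refined.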

Neural Controlled Differential Equations for Irregular Time Series

5 code implementations • NeurIPS 2020 Patrick Kidger, James Morrill, James Foster, Terry Lyons

The resulting neural controlled differential equation model is directly applicable to the general setting of partially-observed, irregularly-sampled multivariate time series, and (unlike previous work on this problem) it may utilise memory-efficient adjoint-based backpropagation even across observations.

Irregular Time Series • Time Series • +1
