We propose a novel framework for solving continuous-time non-Markovian stochastic control problems by means of neural rough differential equations (Neural RDEs) introduced in Morrill et al. (2021).
Motivated by the paradigm of reservoir computing, we consider randomly initialized controlled ResNets defined as Euler-discretizations of neural controlled differential equations (Neural CDEs), a unified architecture which encompasses both RNNs and ResNets.
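As a minimal illustration of this construction, the sketch below implements a randomly initialized controlled ResNet as the Euler discretization h_{k+1} = h_k + f(h_k) Δx_k of a Neural CDE, with an untrained vector field f in the reservoir-computing spirit. All names, dimensions, and weight scales here are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_vector_field(d_hidden, d_control):
    # Randomly initialized, untrained weights (reservoir-computing style).
    W1 = rng.normal(scale=1.0 / np.sqrt(d_hidden), size=(d_hidden, d_hidden))
    b1 = rng.normal(size=d_hidden)
    W2 = rng.normal(scale=1.0 / np.sqrt(d_hidden), size=(d_hidden * d_control, d_hidden))

    def f(h):
        # Maps the hidden state to a (d_hidden, d_control) matrix acting on path increments.
        z = np.tanh(W1 @ h + b1)
        return (W2 @ z).reshape(d_hidden, d_control)

    return f

def controlled_resnet(x, d_hidden=8):
    # x: (T, d_control) discretized control path.
    # Euler scheme of the CDE: h_{k+1} = h_k + f(h_k) @ (x_{k+1} - x_k).
    T, d_control = x.shape
    f = random_vector_field(d_hidden, d_control)
    h = np.zeros(d_hidden)
    for k in range(T - 1):
        dx = x[k + 1] - x[k]
        h = h + f(h) @ dx
    return h

# Drive the reservoir with a Brownian-like control path.
path = np.cumsum(rng.normal(scale=0.1, size=(100, 2)), axis=0)
features = controlled_resnet(path)
print(features.shape)
```

The final hidden state can then serve as a random feature vector for a downstream linear readout, as is standard in reservoir computing.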
This article provides a concise overview of some of the recent advances in the application of rough path theory to machine learning.
On the other hand, it extends Neural Operators -- generalizations of neural networks to model mappings between spaces of functions -- in that it can parameterize solution operators of SPDEs depending simultaneously on the initial condition and a realization of the driving noise.
Stochastic processes are random variables with values in some space of paths.
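For instance, a standard Brownian motion on [0, T] can be sampled as a random element of path space by cumulatively summing independent Gaussian increments; each sampled row below is one realization, i.e. one point in path space (a toy illustration, not code from the article):

```python
import numpy as np

rng = np.random.default_rng(42)

def brownian_paths(n_paths, n_steps, T=1.0):
    # Each row is one realization of Brownian motion started at 0:
    # a single point in the path space R^{n_steps + 1}.
    dt = T / n_steps
    increments = rng.normal(scale=np.sqrt(dt), size=(n_paths, n_steps))
    paths = np.concatenate(
        [np.zeros((n_paths, 1)), np.cumsum(increments, axis=1)], axis=1
    )
    return paths

paths = brownian_paths(n_paths=5, n_steps=1000)
print(paths.shape)  # (5, 1001)
```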
Making predictions and quantifying their uncertainty when the input data is sequential is a fundamental learning challenge that has recently attracted increasing attention.
Neural Controlled Differential Equations (Neural CDEs) are the continuous-time analogue of an RNN, just as Neural ODEs are analogous to ResNets.
Neural controlled differential equations (CDEs) are the continuous-time analogue of recurrent neural networks, as Neural ODEs are to residual networks, and offer a memory-efficient continuous-time way to model functions of potentially irregular time series.
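A hedged sketch of the idea behind handling irregular time series: interpolate the observations (t_i, x_i) into a continuous control path X(t), then evolve a hidden state by the CDE dh = f(h) dX(t) with an Euler scheme on a fine grid. The vector field, grid size, and toy data below are invented for illustration and are not the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(1)

# Irregularly sampled 1-d time series: observations (t_i, x_i).
t_obs = np.sort(rng.uniform(0.0, 1.0, size=20))
x_obs = np.sin(4 * np.pi * t_obs) + 0.1 * rng.normal(size=20)

def control_path(t):
    # Linear interpolation turns the discrete observations into a continuous path X(t).
    return np.interp(t, t_obs, x_obs)

def neural_cde_euler(n_steps=200, d_hidden=4):
    # Euler scheme for dh = f(h) dX(t); here f is a random one-layer map
    # and the control is scalar, so f(h) * dX updates the hidden state.
    W = rng.normal(scale=0.5, size=(d_hidden, d_hidden))
    b = rng.normal(size=d_hidden)
    ts = np.linspace(t_obs[0], t_obs[-1], n_steps + 1)
    h = np.zeros(d_hidden)
    for k in range(n_steps):
        dX = control_path(ts[k + 1]) - control_path(ts[k])
        h = h + np.tanh(W @ h + b) * dX
    return h

h_T = neural_cde_euler()
print(h_T.shape)
```

Because the hidden state is defined in continuous time through the interpolated path, the same model applies unchanged to series observed at arbitrary, non-uniform time stamps.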
Recently, there has been an increased interest in the development of kernel methods for learning with sequential data.
In this paper, we develop a rigorous mathematical framework for distribution regression where inputs are complex data streams.
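One concrete instance of this setting can be sketched as follows, under stated assumptions: represent each input (a distribution over streams) by the empirical mean of truncated path signatures, then fit a ridge regression on these features. The helper names and the toy task below are invented for illustration, and the actual framework may rely on signature kernels or higher truncation levels:

```python
import numpy as np

rng = np.random.default_rng(7)

def sig_level2(path):
    # path: (T, d) array; returns levels 1 and 2 of the truncated signature
    # of the piecewise-linear interpolation of the points.
    inc = np.diff(path, axis=0)                 # increments, shape (T-1, d)
    s1 = inc.sum(axis=0)                        # level 1: total increment
    csum = np.cumsum(inc, axis=0)
    prev = np.vstack([np.zeros(inc.shape[1]), csum[:-1]])
    s2 = prev.T @ inc + 0.5 * inc.T @ inc       # level 2: iterated integrals
    return np.concatenate([s1, s2.ravel()])

def expected_signature(bag):
    # Empirical expected signature: a feature vector for a distribution over paths.
    return np.mean([sig_level2(p) for p in bag], axis=0)

def make_bag(sigma, n_paths=30, n_steps=50, d=2):
    # Toy task: each input is a bag of random-walk paths with volatility sigma.
    return [np.cumsum(rng.normal(scale=sigma, size=(n_steps, d)), axis=0)
            for _ in range(n_paths)]

sigmas = rng.uniform(0.5, 2.0, size=40)
X = np.stack([expected_signature(make_bag(s)) for s in sigmas])
y = sigmas

# Ridge regression in closed form, with an intercept column.
Xb = np.hstack([X, np.ones((len(X), 1))])
w = np.linalg.solve(Xb.T @ Xb + 1e-3 * np.eye(Xb.shape[1]), Xb.T @ y)
pred = Xb @ w
print(round(np.corrcoef(pred, y)[0, 1], 3))
```

The level-2 diagonal terms carry the paths' quadratic variation, so the volatility of each bag is recoverable from the expected-signature features by a linear model.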
Mathematical models, calibrated to data, have become ubiquitous in key decision-making processes in modern quantitative finance.