Search Results for author: Lukas Gonon

Found 20 papers, 2 papers with code

Approximation Rates for Deep Calibration of (Rough) Stochastic Volatility Models

no code implementations 26 Sep 2023 Francesca Biagini, Lukas Gonon, Niklas Walter

We derive quantitative error bounds for deep neural networks (DNNs) approximating option prices on a $d$-dimensional risky asset as functions of the underlying model parameters, payoff parameters and initial conditions.

Universal Approximation Theorem and error bounds for quantum neural networks and quantum reservoirs

no code implementations 24 Jul 2023 Lukas Gonon, Antoine Jacquier

Universal approximation theorems are the foundations of classical neural networks, providing theoretical guarantees that the latter are able to approximate maps of interest.

Infinite-dimensional reservoir computing

no code implementations 2 Apr 2023 Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega

The reservoir architectures used for the approximation and estimation of elements in the new class are randomly generated echo state networks with either linear or ReLU activation functions.

Generalization Bounds
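
The reservoir architecture described above can be illustrated with a minimal NumPy sketch of a randomly generated echo state network with ReLU activation, in which only a linear readout is trained. All dimensions, scalings, and the toy one-step-ahead prediction task below are illustrative choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: reservoir and input size.
n_res, n_in = 100, 1
A = rng.uniform(-1, 1, (n_res, n_res))
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))   # scale spectral radius below 1
C = rng.uniform(-1, 1, (n_res, n_in))             # random input weights
b = rng.uniform(-0.1, 0.1, n_res)                 # random bias

def esn_states(inputs, activation=lambda x: np.maximum(x, 0.0)):
    """Run the randomly generated reservoir over an input sequence (ReLU activation)."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = activation(A @ x + C @ np.atleast_1d(u) + b)
        states.append(x)
    return np.array(states)

# Only the linear readout W is trained, here by ridge regression.
inputs = np.sin(np.linspace(0, 10, 200))
targets = np.roll(inputs, -1)                     # toy one-step-ahead prediction task
X = esn_states(inputs)
W = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ targets)
preds = X @ W
```

The internal weights `A`, `C`, `b` are never optimized; the approximation results in the paper concern precisely such randomly generated reservoirs with trained linear readouts.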

The necessity of depth for artificial neural networks to approximate certain classes of smooth and bounded functions without the curse of dimensionality

no code implementations 19 Jan 2023 Lukas Gonon, Robin Graeber, Arnulf Jentzen

In particular, it is a key contribution of this work to reveal that for all $a, b\in\mathbb{R}$ with $b-a\geq 7$ the functions $[a, b]^d\ni x=(x_1,\dots, x_d)\mapsto\prod_{i=1}^d x_i\in\mathbb{R}$ for $d\in\mathbb{N}$, as well as the functions $[a, b]^d\ni x =(x_1,\dots, x_d)\mapsto\sin(\prod_{i=1}^d x_i) \in \mathbb{R}$ for $d \in \mathbb{N}$, can be approximated without the curse of dimensionality neither by shallow ANNs nor by insufficiently deep ANNs with ReLU activation, but can be approximated without the curse of dimensionality by sufficiently deep ANNs with ReLU activation.

Reservoir kernels and Volterra series

1 code implementation 30 Dec 2022 Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega

A universal kernel is constructed whose sections approximate any causal and time-invariant filter in the fading memory category with inputs and outputs in a finite-dimensional Euclidean space.

Deep neural network expressivity for optimal stopping problems

no code implementations 19 Oct 2022 Lukas Gonon

This article studies deep neural network expression rates for optimal stopping problems of discrete-time Markov processes on high-dimensional state spaces.

Detecting asset price bubbles using deep learning

no code implementations 4 Oct 2022 Francesca Biagini, Lukas Gonon, Andrea Mazzon, Thilo Meyer-Brandis

In this paper we employ deep learning techniques to detect financial asset bubbles by using observed call option prices.

Expressive Power of Randomized Signature

no code implementations NeurIPS Workshop DLDE 2021 Christa Cuchiero, Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega, Josef Teichmann

We consider the question whether the time evolution of controlled differential equations on general state spaces can be arbitrarily well approximated by (regularized) regressions on features generated themselves through randomly chosen dynamical systems of moderately high dimension.

Neural network approximation for superhedging prices

no code implementations 29 Jul 2021 Francesca Biagini, Lukas Gonon, Thomas Reitsam

First we prove that the $\alpha$-quantile hedging price converges to the superhedging price at time $0$ for $\alpha$ tending to $1$, and show that the $\alpha$-quantile hedging price can be approximated by a neural network-based price.

Random feature neural networks learn Black-Scholes type PDEs without curse of dimensionality

no code implementations 14 Jun 2021 Lukas Gonon

We derive bounds for the prediction error of random neural networks for learning sufficiently non-degenerate Black-Scholes type models.

Deep ReLU neural networks overcome the curse of dimensionality for partial integrodifferential equations

no code implementations 23 Feb 2021 Lukas Gonon, Christoph Schwab

Deep neural networks (DNNs) with ReLU activation function are proved to be able to express viscosity solutions of linear partial integrodifferential equations (PIDEs) on state spaces of possibly high dimension $d$.

Numerical Analysis, Probability

Fading memory echo state networks are universal

no code implementations 22 Oct 2020 Lukas Gonon, Juan-Pablo Ortega

Echo state networks (ESNs) have recently been proved to be universal approximants for input/output systems with respect to various $L^p$-type criteria.

Discrete-time signatures and randomness in reservoir computing

no code implementations 17 Sep 2020 Christa Cuchiero, Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega, Josef Teichmann

A new explanation of the geometric nature of the reservoir computing phenomenon is presented.

Weak error analysis for stochastic gradient descent optimization algorithms

no code implementations 3 Jul 2020 Aritz Bercher, Lukas Gonon, Arnulf Jentzen, Diyora Salimova

In applications one is often not only interested in the size of the error with respect to the objective function but also in the size of the error with respect to a test function which is possibly different from the objective function.

Memory and forecasting capacities of nonlinear recurrent networks

no code implementations 22 Apr 2020 Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega

The notion of memory capacity, originally introduced for echo state and linear networks with independent inputs, is generalized to nonlinear recurrent networks with stationary but dependent inputs.

Time Series, Time Series Analysis
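
The classical notion of memory capacity generalized in this paper can be sketched numerically: for a linear reservoir, capacity sums, over lags $k$, the best linear readout's $R^2$ for reconstructing the input $k$ steps in the past from the current state. The dimensions, spectral-radius scaling, and i.i.d. inputs below are illustrative assumptions, not the paper's stationary dependent-input setting:

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative linear reservoir driven by i.i.d. uniform inputs.
n_res, T = 50, 5000
A = rng.uniform(-1, 1, (n_res, n_res))
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))   # scale spectral radius below 1
C = rng.uniform(-1, 1, n_res)

u = rng.uniform(-1, 1, T)
x = np.zeros((T, n_res))
for t in range(1, T):
    x[t] = A @ x[t - 1] + C * u[t]

def r_squared(states, target):
    """R^2 of the best linear readout reconstructing `target` from `states`."""
    w, *_ = np.linalg.lstsq(states, target, rcond=None)
    resid = target - states @ w
    return 1.0 - resid.var() / target.var()

# Memory capacity: summed ability to recover past inputs u_{t-k} from the state x_t.
burn = 100
capacity = sum(r_squared(x[burn:], np.roll(u, k)[burn:]) for k in range(1, 21))
```

Each lag contributes an $R^2$ in $[0, 1]$, so the capacity over 20 lags is at most 20 (and at most the reservoir dimension in the linear case).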

Approximation Bounds for Random Neural Networks and Reservoir Systems

no code implementations 14 Feb 2020 Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega

This work studies approximation based on single-hidden-layer feedforward and recurrent neural networks with randomly generated internal weights.
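A minimal sketch of the feedforward case studied here, assuming Gaussian random internal weights, ReLU activation, and a least-squares fit of the output layer only (the target function and all sizes below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative setup: approximate a smooth function with random ReLU features.
d, n_features, n_samples = 2, 300, 1000
W = rng.standard_normal((n_features, d))   # random, untrained hidden weights
b = rng.standard_normal(n_features)

def random_features(x):
    """Single hidden layer with randomly generated internal weights (never trained)."""
    return np.maximum(x @ W.T + b, 0.0)

X = rng.uniform(-1, 1, (n_samples, d))
y = np.sin(np.pi * X[:, 0]) * X[:, 1]      # illustrative target function

# Only the output layer is fit, by least squares.
Phi = random_features(X)
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
mse = np.mean((Phi @ coef - y) ** 2)
```

The approximation bounds in the paper quantify how well such networks, with hidden weights drawn at random rather than optimized, can approximate target functions as `n_features` grows.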

Uniform error estimates for artificial neural network approximations for heat equations

no code implementations 20 Nov 2019 Lukas Gonon, Philipp Grohs, Arnulf Jentzen, David Kofler, David Šiška

These mathematical results from the scientific literature prove in part that algorithms based on ANNs are capable of overcoming the curse of dimensionality in the numerical approximation of high-dimensional PDEs.

Risk bounds for reservoir computing

no code implementations 30 Oct 2019 Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega

We analyze the practices of reservoir computing in the framework of statistical learning theory.

Learning Theory

Reservoir Computing Universality With Stochastic Inputs

no code implementations 7 Jul 2018 Lukas Gonon, Juan-Pablo Ortega

The universal approximation properties with respect to $L^p$-type criteria of three important families of reservoir computers with stochastic discrete-time semi-infinite inputs are shown.

Deep Hedging

3 code implementations 8 Feb 2018 Hans Bühler, Lukas Gonon, Josef Teichmann, Ben Wood

We present a framework for hedging a portfolio of derivatives in the presence of market frictions such as transaction costs, market impact, liquidity constraints or risk limits using modern deep reinforcement learning methods.

Computational Finance, Numerical Analysis, Optimization and Control, Probability, Risk Management (MSC: 91G60, 65K99)
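
The objective that deep hedging minimizes can be sketched as follows: simulate market paths, compute the terminal profit and loss of a hedging strategy net of proportional transaction costs, and evaluate a convex risk measure of that P&L. Everything below (GBM dynamics, a short call position, the entropic risk measure, all parameter values) is an illustrative assumption; in the paper the strategy is a neural network trained to minimize the risk, whereas here only fixed strategies are evaluated:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative market: GBM stock paths, European call payoff, proportional costs.
n_paths, n_steps = 5000, 30
S0, sigma, dt, K, cost = 100.0, 0.2, 1 / 365, 100.0, 1e-3

Z = rng.standard_normal((n_paths, n_steps))
S = S0 * np.exp(np.cumsum(-0.5 * sigma**2 * dt + sigma * np.sqrt(dt) * Z, axis=1))
S = np.concatenate([np.full((n_paths, 1), S0), S], axis=1)

def hedging_pnl(strategy):
    """Terminal P&L of a hedging strategy (a function of time index and spot)."""
    pnl = -np.maximum(S[:, -1] - K, 0.0)           # short the call payoff
    prev = np.zeros(n_paths)
    for t in range(n_steps):
        h = strategy(t, S[:, t])                   # position chosen at time t
        pnl -= cost * S[:, t] * np.abs(h - prev)   # proportional transaction costs
        pnl += h * (S[:, t + 1] - S[:, t])         # trading gains over the next step
        prev = h
    return pnl

def entropic_risk(pnl, lam=1.0):
    """Convex risk measure minimized over network strategies in deep hedging."""
    return np.log(np.mean(np.exp(-lam * pnl))) / lam

# Deep hedging would train `strategy` as a neural network; here we compare
# not hedging at all against a naive constant half-unit hedge.
risk_unhedged = entropic_risk(hedging_pnl(lambda t, s: np.zeros_like(s)))
risk_half = entropic_risk(hedging_pnl(lambda t, s: 0.5 * np.ones_like(s)))
```

Even the naive constant hedge reduces the entropic risk of the short call position, which is the effect a trained network strategy improves on.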
