Search Results for author: Samuel Lanthaler

Found 8 papers, 1 paper with code

Discretization Error of Fourier Neural Operators

no code implementations · 3 May 2024 · Samuel Lanthaler, Andrew M. Stuart, Margaret Trautner

Operator learning is a variant of machine learning that is designed to approximate maps between function spaces from data.

Operator learning
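
As a concrete illustration of the setup this snippet describes, the sketch below learns a simple operator (the antiderivative map) from discretized input/output function pairs by plain ridge regression on grid values. This is a minimal hypothetical example, not code from the paper; the grid size, ridge parameter, and choice of target operator are all assumptions made here for illustration.

```python
# A minimal sketch (not from the paper): operator learning as supervised
# regression between discretized functions. We learn the antiderivative
# operator u |-> \int_0^x u(t) dt from input/output pairs on a grid.
import numpy as np

rng = np.random.default_rng(0)
n_grid, n_train = 64, 200
x = np.linspace(0.0, 1.0, n_grid)

# Random smooth inputs: truncated Fourier series with decaying coefficients.
def sample_input():
    k = np.arange(1, 6)
    a = rng.standard_normal(5) / k
    return (a[:, None] * np.sin(np.pi * k[:, None] * x)).sum(axis=0)

U = np.stack([sample_input() for _ in range(n_train)])  # input functions
V = np.cumsum(U, axis=1) * (x[1] - x[0])                # outputs: antiderivatives

# Linear operator learning: fit a matrix A by ridge regression so that V ~ U A.
lam = 1e-6
A = np.linalg.solve(U.T @ U + lam * np.eye(n_grid), U.T @ V)

u_test = sample_input()
v_pred = u_test @ A
v_true = np.cumsum(u_test) * (x[1] - x[0])
print("relative L2 error:", np.linalg.norm(v_pred - v_true) / np.linalg.norm(v_true))
```

The FNO analyzed in the paper replaces the learned dense matrix with structured spectral layers; the point of the sketch is only the data-driven function-to-function regression setup.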

Operator Learning: Algorithms and Analysis

no code implementations · 24 Feb 2024 · Nikola B. Kovachki, Samuel Lanthaler, Andrew M. Stuart

This review article summarizes recent progress and the current state of our theoretical understanding of neural operators, focusing on an approximation theoretic point of view.

Model Discovery · Operator learning

The Parametric Complexity of Operator Learning

no code implementations · 28 Jun 2023 · Samuel Lanthaler, Andrew M. Stuart

The first contribution of this paper is to prove that for general classes of operators which are characterized only by their $C^r$- or Lipschitz-regularity, operator learning suffers from a "curse of parametric complexity", which is an infinite-dimensional analogue of the well-known curse of dimensionality encountered in high-dimensional approximation problems.

Operator learning
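
The "curse of parametric complexity" admits a short schematic statement. The LaTeX fragment below gives only the generic shape of such a lower bound; the exponent $\gamma$ and the constants are placeholders assumed here for illustration, and the precise hypotheses and rates are those of the paper, not reproduced.

```latex
% Schematic form of a curse-of-parametric-complexity lower bound.
% Exponents and constants are placeholders, not the paper's values.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let $\mathcal{A}$ be a class of operators characterized only by
$C^r$- or Lipschitz-regularity. A curse of parametric complexity
asserts that there exist $c, \gamma > 0$ such that any approximation
family $\{\Psi_p\}$ with $p$ tunable parameters satisfies
\[
  \sup_{\mathcal{G} \in \mathcal{A}}
    \lVert \mathcal{G} - \Psi_p \rVert \le \varepsilon
  \quad \Longrightarrow \quad
  p \ge c \exp\!\bigl(c\,\varepsilon^{-\gamma}\bigr),
\]
i.e.\ the parameter count must grow exponentially in a power of
$1/\varepsilon$, in contrast to the algebraic rates
$p \sim \varepsilon^{-d}$ familiar from finite-dimensional
approximation.
\end{document}
```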

Error Bounds for Learning with Vector-Valued Random Features

1 code implementation · NeurIPS 2023 · Samuel Lanthaler, Nicholas H. Nelsen

This paper provides a comprehensive error analysis of learning with vector-valued random features (RF).

regression
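
A minimal sketch of the kind of vector-valued random-feature regression the error analysis concerns: random tanh features are drawn once and fixed, and only a linear readout, shared jointly across the output components, is fit by ridge regression. The target function, feature count, and ridge parameter below are illustrative assumptions, not the paper's choices.

```python
# Vector-valued random-feature ridge regression (illustrative sketch).
import numpy as np

rng = np.random.default_rng(1)
d_in, n_feat, n_train = 4, 300, 500

# Fixed random features phi(x) = tanh(W x + b); only the readout C is fit.
W = rng.standard_normal((n_feat, d_in))
b = rng.standard_normal(n_feat)
phi = lambda X: np.tanh(X @ W.T + b)

# Synthetic vector-valued (3-component) target to regress against.
def target(X):
    return np.stack([np.sin(X[:, 0]),
                     X[:, 1] * X[:, 2],
                     np.cos(X @ np.ones(d_in))], axis=1)

X = rng.uniform(-1, 1, (n_train, d_in))
Y = target(X)

# Ridge regression for the readout: solve (Phi^T Phi + lam I) C = Phi^T Y.
Phi = phi(X)
lam = 1e-4
C = np.linalg.solve(Phi.T @ Phi + lam * np.eye(n_feat), Phi.T @ Y)

X_test = rng.uniform(-1, 1, (200, d_in))
err = np.linalg.norm(phi(X_test) @ C - target(X_test)) / np.linalg.norm(target(X_test))
print("relative test error:", err)
```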

The Nonlocal Neural Operator: Universal Approximation

no code implementations · 26 Apr 2023 · Samuel Lanthaler, Zongyi Li, Andrew M. Stuart

A popular variant of neural operators is the Fourier neural operator (FNO).

Operator learning
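
For readers unfamiliar with the FNO, the sketch below shows a single Fourier-layer forward pass, the nonlocal building block the architecture stacks: transform to Fourier space, act on a truncated set of low modes with complex weights, transform back, and combine with a local linear path and a pointwise nonlinearity. The weights here are random stand-ins (a trained FNO learns them), and the sizes are assumptions for illustration.

```python
# One Fourier-layer forward pass, the spectral building block of the FNO.
import numpy as np

rng = np.random.default_rng(2)
n_grid, n_modes = 128, 16

def fourier_layer(u, R, w):
    """u: (n_grid,) real signal; R: (n_modes,) complex spectral weights;
    w: scalar weight for the pointwise (local) linear path."""
    u_hat = np.fft.rfft(u)                    # forward FFT
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = R * u_hat[:n_modes]   # act only on the low modes
    nonlocal_part = np.fft.irfft(out_hat, n=n_grid)
    return np.tanh(nonlocal_part + w * u)     # local path + activation

x = np.linspace(0, 2 * np.pi, n_grid, endpoint=False)
u = np.sin(3 * x) + 0.5 * np.cos(7 * x)
R = rng.standard_normal(n_modes) + 1j * rng.standard_normal(n_modes)
v = fourier_layer(u, R, w=0.3)
print(v.shape)  # (128,): same grid, transformed function values
```

Because the layer acts on Fourier coefficients rather than grid values, the same weights define a map on any discretization of the input function; this is the nonlocality the paper's universal approximation result isolates.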

Operator learning with PCA-Net: upper and lower complexity bounds

no code implementations · 28 Mar 2023 · Samuel Lanthaler

Two potential obstacles to efficient operator learning with PCA-Net are identified and made precise through lower complexity bounds; the first relates to the complexity of the output distribution, as measured by a slow decay of the PCA eigenvalues.

Operator learning
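
PCA-Net, in outline: compress input and output functions with data-driven PCA (POD) bases, then learn a map between the low-dimensional coefficient spaces. The sketch below is an illustrative assumption-laden stand-in, not the paper's code: it uses a linear least-squares coefficient map and a linear target operator (differentiation) so the demo stays simple, whereas PCA-Net proper uses a neural network for the middle map.

```python
# PCA-Net in outline: PCA-compress functions, map coefficients to coefficients.
import numpy as np

rng = np.random.default_rng(3)
n_grid, n_train, k = 64, 300, 8
x = np.linspace(0, 1, n_grid)

def sample_u():
    modes = np.arange(1, 6)
    a = rng.standard_normal(5) / modes
    return (a[:, None] * np.sin(np.pi * modes[:, None] * x)).sum(axis=0)

U = np.stack([sample_u() for _ in range(n_train)])
V = np.gradient(U, x, axis=1)          # target operator: differentiation (linear)

# POD-style bases (uncentered PCA) from the training snapshots.
def pca_basis(S, k):
    _, _, Vt = np.linalg.svd(S, full_matrices=False)
    return Vt[:k]                       # (k, n_grid)

P_in, P_out = pca_basis(U, k), pca_basis(V, k)
A = U @ P_in.T                          # input PCA coefficients
B = V @ P_out.T                         # output PCA coefficients
M = np.linalg.lstsq(A, B, rcond=None)[0]  # coefficient-to-coefficient map

u = sample_u()
v_pred = (u @ P_in.T) @ M @ P_out       # encode, map, decode back to the grid
v_true = np.gradient(u, x)
print("relative error:", np.linalg.norm(v_pred - v_true) / np.linalg.norm(v_true))
```

The lower bounds in the paper concern exactly the two compression stages visible here: how fast the PCA eigenvalues decay, and how hard the coefficient-to-coefficient map is to represent.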

On the approximation of functions by tanh neural networks

no code implementations · 18 Apr 2021 · Tim De Ryck, Samuel Lanthaler, Siddhartha Mishra

We derive bounds on the error, in high-order Sobolev norms, incurred in the approximation of Sobolev-regular as well as analytic functions by neural networks with the hyperbolic tangent activation function.
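
As a hands-on counterpart to such bounds, the sketch below approximates an analytic function with a shallow tanh network: hidden weights are drawn at random and fixed, the output layer is solved by least squares, and the uniform error is measured on a fine grid. This illustrates tanh-network approximation, not the constructive proof technique of the paper; the width, weight ranges, and target function are assumptions made here.

```python
# Shallow tanh network fit to an analytic function (illustrative sketch).
import numpy as np

rng = np.random.default_rng(4)
width, n_train = 50, 400
f = lambda x: np.exp(np.sin(2 * np.pi * x))   # analytic target on [0, 1]

W = rng.uniform(-8, 8, width)                 # hidden weights (fixed)
b = rng.uniform(-8, 8, width)                 # hidden biases (fixed)
feats = lambda x: np.tanh(np.outer(x, W) + b)

# Solve for the output layer by least squares on a training grid.
x_train = np.linspace(0, 1, n_train)
c = np.linalg.lstsq(feats(x_train), f(x_train), rcond=None)[0]

# Check the uniform (sup-norm) error on a much finer grid.
x_fine = np.linspace(0, 1, 4000)
sup_err = np.max(np.abs(feats(x_fine) @ c - f(x_fine)))
print("sup-norm error:", sup_err)
```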
