Search Results for author: Samuel Lanthaler

Found 10 papers, 1 paper with code

Operator Learning of Lipschitz Operators: An Information-Theoretic Perspective

no code implementations • 26 Jun 2024 • Samuel Lanthaler

This work addresses the parametric complexity of neural operator approximations for the general class of Lipschitz continuous operators.

Operator learning

Data Complexity Estimates for Operator Learning

no code implementations • 25 May 2024 • Nikola B. Kovachki, Samuel Lanthaler, Hrushikesh Mhaskar

The second contribution of this work is to show that "parametric efficiency" implies "data efficiency": using the Fourier neural operator (FNO) as a case study, we show rigorously that, on a narrower class of operators that FNO approximates efficiently in terms of the number of tunable parameters, efficient operator learning is attainable in data complexity as well.

Operator learning
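As a rough illustration of the FNO case study mentioned in the entry above, here is a minimal sketch (not the authors' code; the function name, random weights, and test signal are all illustrative) of the core FNO building block: a 1-D spectral convolution whose tunable-parameter count is set by the number of retained Fourier modes rather than by the grid resolution.

```python
# Minimal sketch of an FNO-style spectral convolution (illustrative only).
import numpy as np

def spectral_conv_1d(u, weights, n_modes):
    """Apply a learned multiplier to the lowest `n_modes` Fourier modes of u."""
    u_hat = np.fft.rfft(u)                         # grid function -> Fourier coefficients
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = weights * u_hat[:n_modes]  # truncate + multiply (the tunable part)
    return np.fft.irfft(out_hat, n=len(u))         # back to the grid

rng = np.random.default_rng(0)
n_modes = 12
weights = rng.standard_normal(n_modes) + 1j * rng.standard_normal(n_modes)

x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
u = np.sin(x) + 0.5 * np.sin(3 * x)
v = spectral_conv_1d(u, weights, n_modes)
```

Because the weights act on a fixed number of modes, the same layer can be evaluated on any grid, which is what makes parameter-count statements about FNO resolution-independent.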

Discretization Error of Fourier Neural Operators

no code implementations • 3 May 2024 • Samuel Lanthaler, Andrew M. Stuart, Margaret Trautner

Operator learning is a variant of machine learning that is designed to approximate maps between function spaces from data.

Operator learning
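As a toy companion to this entry (my own construction, not the paper's analysis), the snippet below applies one fixed Fourier-multiplier operator, here differentiation, on grids of increasing resolution and measures the gap to the exact continuum answer.

```python
# Toy illustration of discretization error for a Fourier-multiplier operator.
import numpy as np

def fourier_derivative(u, length=2 * np.pi):
    """Differentiate a periodic grid function via its discrete Fourier modes."""
    n = len(u)
    k = np.fft.rfftfreq(n, d=length / n) * 2 * np.pi  # integer wavenumbers
    return np.fft.irfft(1j * k * np.fft.rfft(u), n=n)

for n in (16, 32, 64):
    x = np.linspace(0, 2 * np.pi, n, endpoint=False)
    u = np.exp(np.sin(x))              # smooth periodic input
    exact = np.cos(x) * u              # d/dx exp(sin x)
    err = np.max(np.abs(fourier_derivative(u) - exact))
    print(f"n={n:3d}  max discretization error = {err:.2e}")
```

For smooth inputs the printed error decays rapidly with resolution; quantifying this kind of gap between the continuum operator and its grid implementation is the theme of the paper.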

Operator Learning: Algorithms and Analysis

no code implementations • 24 Feb 2024 • Nikola B. Kovachki, Samuel Lanthaler, Andrew M. Stuart

This review article summarizes recent progress and the current state of our theoretical understanding of neural operators, focusing on an approximation-theoretic point of view.

Model Discovery • Operator learning

The Parametric Complexity of Operator Learning

no code implementations • 28 Jun 2023 • Samuel Lanthaler, Andrew M. Stuart

The first contribution of this paper is to prove that for general classes of operators which are characterized only by their $C^r$- or Lipschitz-regularity, operator learning suffers from a "curse of parametric complexity", which is an infinite-dimensional analogue of the well-known curse of dimensionality encountered in high-dimensional approximation problems.

Operator learning
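For context, the finite-dimensional phenomenon being generalized is standard approximation theory; the following is a hedged summary in formulas, not a statement taken from the paper itself.

```latex
% Classical curse of dimensionality: uniformly approximating functions in
% the unit ball of C^r([0,1]^d) to accuracy \epsilon requires on the order of
\[
  N(\epsilon) \;\gtrsim\; \epsilon^{-d/r}
\]
% tunable parameters, a count that blows up as the dimension d grows.
% The paper's "curse of parametric complexity" is the analogue of this
% lower bound for operators between function spaces, where the input
% dimension is effectively infinite.
```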

Error Bounds for Learning with Vector-Valued Random Features

2 code implementations • NeurIPS 2023 • Samuel Lanthaler, Nicholas H. Nelsen

This paper provides a comprehensive error analysis of learning with vector-valued random features (RF).

regression
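A minimal sketch of the setting studied here, assuming the classical Rahimi–Recht random Fourier features; the synthetic data, feature map, and regularization strength are illustrative choices of mine, not the paper's.

```python
# Vector-valued random-feature ridge regression (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)
n, d_in, d_out, n_feat, lam = 200, 3, 2, 300, 1e-6

# Synthetic data: a smooth vector-valued target.
X = rng.uniform(-1, 1, size=(n, d_in))
Y = np.stack([np.sin(X @ rng.standard_normal(d_in)),
              np.cos(X @ rng.standard_normal(d_in))], axis=1)

# Random Fourier features: phi(x) = cos(Wx + b) with fixed random W, b.
W = rng.standard_normal((d_in, n_feat))
b = rng.uniform(0, 2 * np.pi, n_feat)
Phi = np.cos(X @ W + b)

# Ridge regression: one linear solve, shared across all output components.
A = Phi.T @ Phi + lam * n * np.eye(n_feat)
C = np.linalg.solve(A, Phi.T @ Y)   # (n_feat, d_out) coefficient matrix

Y_hat = Phi @ C
print("train RMSE:", np.sqrt(np.mean((Y_hat - Y) ** 2)))
```

The vector-valued view means a single ridge solve over the shared random features yields all output components at once, which is what makes the method attractive for function-valued outputs.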

Operator learning with PCA-Net: upper and lower complexity bounds

no code implementations • 28 Mar 2023 • Samuel Lanthaler

Two potential obstacles to efficient operator learning with PCA-Net are then identified and made precise through lower complexity bounds; the first relates to the complexity of the output distribution, as measured by a slow decay of the PCA eigenvalues.

Operator learning
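A dependency-free PCA-Net-style sketch follows, with illustrative assumptions throughout: the synthetic operator is invented for the demo, and plain least squares stands in for the neural network that maps inputs to PCA coefficients.

```python
# PCA-Net-style sketch: PCA-compress output functions, learn the coefficients.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_grid, n_pca = 500, 64, 8

# Synthetic operator data: input parameter a -> output function u_a on a grid.
x = np.linspace(0, 1, n_grid)
a = rng.uniform(0.5, 2.0, size=(n_samples, 1))
U = np.sin(np.pi * a * x)                      # (n_samples, n_grid) outputs

# PCA of the outputs: the eigenvalue decay controls how many modes we need.
U_mean = U.mean(axis=0)
_, s, Vt = np.linalg.svd(U - U_mean, full_matrices=False)
V = Vt[:n_pca].T                               # leading PCA basis (n_grid, n_pca)

# Learn parameter -> PCA coefficients (least squares stands in for a NN).
coeffs = (U - U_mean) @ V                      # (n_samples, n_pca) targets
features = np.hstack([a, a**2, a**3, np.ones_like(a)])
beta, *_ = np.linalg.lstsq(features, coeffs, rcond=None)

# Reconstruct: features -> coefficients -> function values.
U_hat = U_mean + (features @ beta) @ V.T
print("relative L2 error:", np.linalg.norm(U_hat - U) / np.linalg.norm(U))
```

Note how `n_pca` interacts with the singular values `s`: slow decay, the first obstacle identified in the paper, forces more retained modes for the same accuracy.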

On the approximation of functions by tanh neural networks

no code implementations • 18 Apr 2021 • Tim De Ryck, Samuel Lanthaler, Siddhartha Mishra

We derive bounds on the error, in high-order Sobolev norms, incurred in the approximation of Sobolev-regular as well as analytic functions by neural networks with the hyperbolic tangent activation function.
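As a small numerical companion (my construction, not the paper's proof technique): a one-hidden-layer tanh network fit to an analytic target, with random inner weights and a least-squares output layer, which is enough to watch the sup-norm error drop as the width grows.

```python
# One-hidden-layer tanh approximation of an analytic function (illustrative).
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 400)[:, None]
f = np.exp(x) * np.sin(3 * x)                  # analytic target

for width in (4, 8, 16, 32):
    W = rng.standard_normal((1, width)) * 3.0  # fixed random inner weights
    b = rng.uniform(-3, 3, width)
    H = np.tanh(x @ W + b)                     # hidden tanh features
    H1 = np.hstack([H, np.ones_like(x)])       # plus an output bias
    c, *_ = np.linalg.lstsq(H1, f, rcond=None)
    err = np.max(np.abs(H1 @ c - f))
    print(f"width={width:2d}  sup-norm error = {err:.2e}")
```

Only the output layer is trained here; the paper's bounds concern fully tuned tanh networks and also control high-order Sobolev norms of the error, not just the sup-norm.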
