no code implementations • 4 Mar 2024 • Joost A. A. Opschoor, Christoph Schwab
We analyze deep neural network (DNN) emulation rates of smooth functions with point singularities in bounded, polytopal domains $\mathrm{D} \subset \mathbb{R}^d$, $d=2, 3$.
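As an illustrative prototype (ours, not taken from the paper itself): a function with a point singularity at some $c \in \overline{\mathrm{D}}$ that is smooth elsewhere can be modeled as
$$ u(x) = |x - c|^{\alpha}\, v(x), \qquad \alpha > 0, \quad v \in C^{\infty}(\overline{\mathrm{D}}), $$
which is the kind of target function for which such DNN emulation rates are derived.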
no code implementations • 12 Jan 2024 • Joost A. A. Opschoor, Christoph Schwab, Christos Xenophontos
We prove deep neural network (DNN for short) expressivity rate bounds, in Sobolev norms, for solution sets of a model class of singularly perturbed, elliptic two-point boundary value problems on the bounded interval $(-1, 1)$.
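A standard model instance of this problem class (a hedged illustration, not necessarily the paper's exact formulation) is the reaction-diffusion problem
$$ -\varepsilon^2 u'' + u = f \ \text{in } (-1, 1), \qquad u(\pm 1) = 0, \qquad 0 < \varepsilon \ll 1, $$
whose solutions develop boundary layers of the form $\exp(-(1 \pm x)/\varepsilon)$ that make approximation uniform in $\varepsilon$ delicate.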
no code implementations • 11 Oct 2023 • Joost A. A. Opschoor, Christoph Schwab
We address expression rates and stability in Sobolev norms of deep ReLU neural networks (NNs), in terms of the number of parameters defining the NN, for continuous, piecewise polynomial functions on arbitrary, finite partitions $\mathcal{T}$ of a bounded interval $(a, b)$.
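A minimal sketch of the classical building block behind such results (our illustration; function names are ours): a one-hidden-layer ReLU network with three units reproduces a piecewise linear hat function exactly, with no approximation error.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x, a, m, b):
    """Continuous piecewise linear 'hat' on [a, b], peak 1 at m, zero outside,
    written exactly as a one-hidden-layer ReLU network with 3 units."""
    s1 = 1.0 / (m - a)            # slope up on [a, m]
    s2 = 1.0 / (b - m)            # slope down on [m, b]
    return s1 * relu(x - a) - (s1 + s2) * relu(x - m) + s2 * relu(x - b)

x = np.linspace(-1.0, 2.0, 7)
print(hat(x, a=0.0, m=0.5, b=1.0))   # 0 outside [0, 1], 1 at x = 0.5
```

Sums of scaled and shifted hat functions represent any continuous piecewise linear function on a partition, which is the starting point for piecewise polynomial emulation.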
no code implementations • 19 Jul 2023 • Christoph Schwab, Andreas Stein, Jakob Zech
We establish universality and expression rate bounds for a class of neural Deep Operator Networks (DON) emulating Lipschitz (or H\"older) continuous maps $\mathcal G:\mathcal X\to\mathcal Y$ between (subsets of) separable Hilbert spaces $\mathcal X$, $\mathcal Y$.
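A minimal sketch of the DON branch/trunk structure (our illustration with untrained, random placeholder weights; all names and sizes are assumptions): the branch net encodes sensor values of the input function, the trunk net encodes an output query location, and the DON output is their inner product.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(widths):
    """Random-weight MLP with tanh activations; returns a forward function.
    (Weights are placeholders -- in practice they would be trained.)"""
    Ws = [rng.standard_normal((m, n)) / np.sqrt(m)
          for m, n in zip(widths[:-1], widths[1:])]
    def forward(x):
        for W in Ws[:-1]:
            x = np.tanh(x @ W)
        return x @ Ws[-1]
    return forward

m, p = 32, 16                      # number of input sensors, latent dimension
branch = mlp([m, 64, p])           # encodes sensor values of the input function
trunk  = mlp([1, 64, p])           # encodes the output query location y

def deeponet(u_sensors, y):
    """DON evaluation: inner product of branch and trunk features."""
    return branch(u_sensors) @ trunk(y).T

u = np.sin(np.linspace(0.0, np.pi, m))     # input function sampled at m sensors
y = np.array([[0.25], [0.5], [0.75]])      # query points in the output domain
print(deeponet(u[None, :], y).shape)       # (1, 3)
```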
no code implementations • 11 Jul 2022 • Lukas Herrmann, Christoph Schwab, Jakob Zech
Specifically, we study approximation rates for Deep Neural Operator and Generalized Polynomial Chaos (gpc) Operator surrogates for nonlinear, holomorphic maps between infinite-dimensional, separable Hilbert spaces.
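A hedged one-parameter sketch of the gpc idea (our illustration; the map $G$ below is a hypothetical stand-in): the Legendre coefficients of a map holomorphic in a neighborhood of $[-1, 1]$ decay exponentially, which is what makes truncated gpc expansions efficient surrogates.

```python
import numpy as np
from numpy.polynomial.legendre import leggauss, Legendre

# Hypothetical scalar parametric map y -> G(y), holomorphic near [-1, 1]
# (pole at y = -2, outside the parameter interval).
G = lambda y: 1.0 / (2.0 + y)

nodes, weights = leggauss(32)      # 32-point Gauss-Legendre rule on [-1, 1]

def gpc_coeff(k):
    """k-th Legendre gpc coefficient c_k = (2k+1)/2 * int_{-1}^{1} G(y) P_k(y) dy."""
    Pk = Legendre.basis(k)(nodes)
    return (2 * k + 1) / 2.0 * np.sum(weights * G(nodes) * Pk)

coeffs = np.array([gpc_coeff(k) for k in range(8)])
print(np.abs(coeffs))              # coefficients decay exponentially in k
```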
no code implementations • 14 Jan 2022 • Marcello Longo, Joost A. A. Opschoor, Nico Disch, Christoph Schwab, Jakob Zech
Our construction and DNN architecture generalize previous results in that no geometric restrictions on the regular simplicial partitions $\mathcal{T}$ of $\Omega$ are required for DNN emulation.
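One standard ingredient of such constructions (a sketch under our own naming, not the paper's full construction): pointwise $\min$ and $\max$, from which continuous piecewise linear finite element functions on simplicial partitions can be assembled, are exactly expressible with ReLU units.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# min and max of two values, written with ReLU only -- a building block
# for emulating continuous piecewise linear finite element functions.
def relu_min(a, b):
    return a - relu(a - b)

def relu_max(a, b):
    return a + relu(b - a)

a, b = np.array([1.0, -2.0]), np.array([0.5, 3.0])
print(relu_min(a, b), relu_max(a, b))   # [ 0.5 -2. ] [1. 3.]
```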
no code implementations • 15 Dec 2021 • Carlo Marcati, Christoph Schwab
We construct and analyze approximation rates of deep operator networks (ONets) between infinite-dimensional spaces that emulate with an exponential rate of convergence the coefficient-to-solution map of elliptic second-order partial differential equations.
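For concreteness, a minimal finite-difference stand-in (ours; names and discretization are assumptions) for the kind of coefficient-to-solution map $\mathcal G: a \mapsto u$ that such ONets emulate, here for the 1D problem $-(a u')' = f$ with homogeneous Dirichlet boundary conditions:

```python
import numpy as np

def solve(a, f, n=100):
    """Finite-difference approximation of the coefficient-to-solution map
    a |-> u for -(a(x) u'(x))' = f(x) on (0, 1), u(0) = u(1) = 0.
    (Illustrative stand-in for the operator an ONet would emulate.)"""
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    am = a(x[:-1] + h / 2)                 # coefficient at cell midpoints
    A = np.zeros((n - 1, n - 1))
    for i in range(n - 1):
        A[i, i] = (am[i] + am[i + 1]) / h**2
        if i > 0:
            A[i, i - 1] = -am[i] / h**2
        if i < n - 2:
            A[i, i + 1] = -am[i + 1] / h**2
    u = np.linalg.solve(A, f(x[1:-1]))
    return x[1:-1], u

x, u = solve(a=lambda x: 1.0 + x, f=lambda x: np.ones_like(x))
print(u.max())
```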
no code implementations • 13 Nov 2021 • Christoph Schwab, Jakob Zech
For artificial deep neural networks, we prove expression rates for analytic functions $f:\mathbb{R}^d\to\mathbb{R}$ in the norm of $L^2(\mathbb{R}^d,\gamma_d)$ where $d\in {\mathbb{N}}\cup\{ \infty \}$.
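Here $\gamma_d$ denotes the standard Gaussian (product) measure, so the norm in question is
$$ \|f\|_{L^2(\mathbb{R}^d,\gamma_d)}^2 = \int_{\mathbb{R}^d} |f(x)|^2 \,\mathrm{d}\gamma_d(x), \qquad \mathrm{d}\gamma_d(x) = (2\pi)^{-d/2}\, e^{-\|x\|^2/2}\,\mathrm{d}x \quad (d < \infty). $$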
no code implementations • 23 Feb 2021 • Lukas Gonon, Christoph Schwab
Deep neural networks (DNNs) with ReLU activation function are proved to express viscosity solutions of linear partial integro-differential equations (PIDEs) on state spaces of possibly high dimension $d$.
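A model instance of such a linear PIDE (a hedged illustration; the paper's class may be broader) is the Kolmogorov backward equation of a jump-diffusion,
$$ \partial_t u + b \cdot \nabla_x u + \tfrac{1}{2}\operatorname{Tr}\!\big(\sigma\sigma^{\top} \nabla_x^2 u\big) + \int_{\mathbb{R}^d}\!\big(u(t, x+z) - u(t, x) - z \cdot \nabla_x u(t, x)\,\mathbf{1}_{\{\|z\| \le 1\}}\big)\,\nu(\mathrm{d}z) = 0, $$
where $\nu$ is a L\'evy (jump) measure; the nonlocal integral term is what distinguishes PIDEs from PDEs.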
no code implementations • 23 Oct 2020 • Carlo Marcati, Joost A. A. Opschoor, Philipp C. Petersen, Christoph Schwab
We prove exponential expressivity with stable ReLU Neural Networks (ReLU NNs) in $H^1(\Omega)$ for weighted analytic function classes in certain polytopal domains $\Omega$, in space dimension $d=2, 3$.
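One common form of such a weighted analytic class (our paraphrase; the paper's precise definition may differ): with $r(x)$ the distance to the singular set of $\Omega$ (corners and, for $d = 3$, edges), one requires constants $C, A > 0$ such that
$$ \big\| r^{\max(|\alpha| - \beta,\, 0)}\, D^{\alpha} u \big\|_{L^2(\Omega)} \le C\, A^{|\alpha|}\, |\alpha|! \qquad \text{for all } \alpha \in \mathbb{N}_0^d, $$
i.e. derivatives may grow at an analytic rate, weighted by powers of the distance to the singular support.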