Search Results for author: Christoph Schwab

Found 10 papers, 0 papers with code

Exponential Expressivity of ReLU$^k$ Neural Networks on Gevrey Classes with Point Singularities

no code implementations • 4 Mar 2024 • Joost A. A. Opschoor, Christoph Schwab

We analyze deep Neural Network emulation rates of smooth functions with point singularities in bounded, polytopal domains $\mathrm{D} \subset \mathbb{R}^d$, $d=2, 3$.

Neural Networks for Singular Perturbations

no code implementations • 12 Jan 2024 • Joost A. A. Opschoor, Christoph Schwab, Christos Xenophontos

We prove deep neural network (DNN for short) expressivity rate bounds for solution sets of a model class of singularly perturbed, elliptic two-point boundary value problems, in Sobolev norms, on the bounded interval $(-1, 1)$.

Deep ReLU networks and high-order finite element methods II: Chebyshev emulation

no code implementations • 11 Oct 2023 • Joost A. A. Opschoor, Christoph Schwab

We address expression rates and stability in Sobolev norms of deep ReLU neural networks (NNs), in terms of the number of parameters defining the NN, for continuous, piecewise polynomial functions on arbitrary, finite partitions $\mathcal{T}$ of a bounded interval $(a, b)$.
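The key fact underlying results of this kind is that a one-hidden-layer ReLU network represents any continuous piecewise-linear function on a partition exactly, with one neuron per breakpoint. A minimal illustrative sketch (the hat function and its coefficients are our own example, not taken from the paper):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# Any continuous piecewise-linear f on a partition {t_j} of (a, b)
# is exactly f(x) = f(a) + sum_j c_j * relu(x - t_j), where c_j is
# the jump in slope at breakpoint t_j.
# Example: the hat function on [0, 1] with peak at 0.5 has slopes
# 2 and -2, hence slope jumps 2, -4, 2 at t = 0, 0.5, 1:
def hat(x):
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
print(hat(x))  # hat values 0, 0.5, 1, 0.5, 0 at the sample points
```

Emulating higher-degree piecewise polynomials (and Chebyshev expansions, as in the paper) requires deeper ReLU networks that approximate products, which is where the depth/parameter trade-offs in the quoted rates come from.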

Deep Operator Network Approximation Rates for Lipschitz Operators

no code implementations • 19 Jul 2023 • Christoph Schwab, Andreas Stein, Jakob Zech

We establish universality and expression rate bounds for a class of neural Deep Operator Networks (DON) emulating Lipschitz (or Hölder) continuous maps $\mathcal G:\mathcal X\to\mathcal Y$ between (subsets of) separable Hilbert spaces $\mathcal X$, $\mathcal Y$.

Neural and spectral operator surrogates: unified construction and expression rate bounds

no code implementations • 11 Jul 2022 • Lukas Herrmann, Christoph Schwab, Jakob Zech

Specifically, we study approximation rates for Deep Neural Operator and Generalized Polynomial Chaos (gpc) Operator surrogates for nonlinear, holomorphic maps between infinite-dimensional, separable Hilbert spaces.

De Rham compatible Deep Neural Network FEM

no code implementations • 14 Jan 2022 • Marcello Longo, Joost A. A. Opschoor, Nico Disch, Christoph Schwab, Jakob Zech

Our construction and DNN architecture generalize previous results in that no geometric restrictions on the regular simplicial partitions $\mathcal{T}$ of $\Omega$ are required for DNN emulation.

Exponential Convergence of Deep Operator Networks for Elliptic Partial Differential Equations

no code implementations • 15 Dec 2021 • Carlo Marcati, Christoph Schwab

We construct and analyze approximation rates of deep operator networks (ONets) between infinite-dimensional spaces that emulate with an exponential rate of convergence the coefficient-to-solution map of elliptic second-order partial differential equations.
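Operator networks of this kind share a branch/trunk structure: a branch net maps sensor values of the input function to coefficients, a trunk net maps the evaluation point to basis functions, and the output is their inner product. A minimal sketch with random weights (all sizes, names, and weights here are illustrative assumptions, not the paper's construction):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative branch/trunk operator-network skeleton:
#   G(u)(y) ~ sum_k branch_k(u(x_1), ..., u(x_m)) * trunk_k(y)
m, p, width = 16, 8, 32  # sensor count, rank p, hidden width (assumed)

W1b, W2b = rng.normal(size=(width, m)), rng.normal(size=(p, width))
W1t, W2t = rng.normal(size=(width, 1)), rng.normal(size=(p, width))

def onet(u_sensors, y):
    branch = W2b @ np.tanh(W1b @ u_sensors)        # coefficients b_k(u)
    trunk = W2t @ np.tanh(W1t @ np.atleast_1d(y))  # basis values t_k(y)
    return float(branch @ trunk)

xs = np.linspace(0.0, 1.0, m)
u = np.sin(np.pi * xs)   # input function sampled at the m sensors
val = onet(u, 0.3)       # scalar approximation of G(u)(0.3)
```

The approximation-rate question studied in the paper is how the rank $p$ and the branch/trunk sizes must grow to reach a prescribed accuracy; exponential convergence means the error decays exponentially in the network size.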

Deep Learning in High Dimension: Neural Network Approximation of Analytic Functions in $L^2(\mathbb{R}^d,\gamma_d)$

no code implementations • 13 Nov 2021 • Christoph Schwab, Jakob Zech

For artificial deep neural networks, we prove expression rates for analytic functions $f:\mathbb{R}^d\to\mathbb{R}$ in the norm of $L^2(\mathbb{R}^d,\gamma_d)$ where $d\in {\mathbb{N}}\cup\{ \infty \}$.

Deep ReLU neural networks overcome the curse of dimensionality for partial integrodifferential equations

no code implementations • 23 Feb 2021 • Lukas Gonon, Christoph Schwab

Deep neural networks (DNNs) with ReLU activation function are proved to be able to express viscosity solutions of linear partial integrodifferential equations (PIDEs) on state spaces of possibly high dimension $d$.

Numerical Analysis, Probability

Exponential ReLU Neural Network Approximation Rates for Point and Edge Singularities

no code implementations • 23 Oct 2020 • Carlo Marcati, Joost A. A. Opschoor, Philipp C. Petersen, Christoph Schwab

We prove exponential expressivity with stable ReLU Neural Networks (ReLU NNs) in $H^1(\Omega)$ for weighted analytic function classes in certain polytopal domains $\Omega$, in space dimension $d=2, 3$.
