no code implementations • 10 Dec 2021 • Raphael Leiteritz, Patrick Buchfink, Bernard Haasdonk, Dirk Pflüger
Neural networks can be used as surrogates for PDE models.
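As a hedged sketch of this surrogate idea, the snippet below fits a small fully connected network to samples of the analytic heat-equation solution u(x, t) = exp(-pi^2 t) sin(pi x), which stands in for an actual PDE solver. The toy solution, network size, and training setup are illustrative assumptions, not the setup used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "PDE model": analytic heat-equation solution (hypothetical
# toy data, not the models considered in the paper).
def pde_solution(x, t):
    return np.exp(-np.pi**2 * t) * np.sin(np.pi * x)

# Training set: (x, t) -> u samples, as a PDE solver would provide.
X = rng.uniform(0.0, 1.0, size=(200, 2))   # columns: x, t
X[:, 1] *= 0.2                             # keep t small so u is not near zero
y = pde_solution(X[:, 0], X[:, 1])

# One-hidden-layer tanh network trained by full-batch gradient descent.
n_h = 32
W1 = rng.normal(0.0, 1.0, (2, n_h)); b1 = np.zeros(n_h)
W2 = rng.normal(0.0, 0.1, (n_h, 1)); b2 = np.zeros(1)

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, (H @ W2 + b2).ravel()

_, pred = forward(X)
loss_start = np.mean((pred - y) ** 2)

lr = 0.01
for _ in range(2000):
    H, pred = forward(X)
    g = 2.0 * (pred - y)[:, None] / len(y)   # d(mse)/d(pred)
    dH = (g @ W2.T) * (1.0 - H ** 2)         # backprop through tanh
    W2 -= lr * (H.T @ g); b2 -= lr * g.sum(axis=0)
    W1 -= lr * (X.T @ dH); b1 -= lr * dH.sum(axis=0)

_, pred = forward(X)
loss_end = np.mean((pred - y) ** 2)          # should be well below loss_start
```

Once trained, the network evaluates the surrogate at any (x, t) far faster than re-running a solver, which is the practical appeal of the approach.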
no code implementations • 15 May 2021 • Tizian Wenzel, Gabriele Santin, Bernard Haasdonk
In particular, we show that using special types of kernels yields models reminiscent of neural networks, founded in the same theoretical framework as classical kernel methods while enjoying many computational properties of deep neural networks.
no code implementations • 25 Mar 2021 • Tizian Wenzel, Marius Kurz, Andrea Beck, Gabriele Santin, Bernard Haasdonk
Standard kernel methods for machine learning usually struggle when dealing with large datasets.
no code implementations • 1 Dec 2020 • Bernard Haasdonk, Boumediene Hamzi, Gabriele Santin, Dominik Wittwar
We then use an apposite data-based kernel method to construct a suitable approximation of the manifold close to the equilibrium, which is compatible with our general error theory.
no code implementations • 27 Apr 2020 • Bernard Haasdonk, Tizian Wenzel, Gabriele Santin, Syn Schmitt
Greedy kernel approximation algorithms are successful techniques for sparse and accurate data-based modelling and function approximation.
1 code implementation • 11 Nov 2019 • Tizian Wenzel, Gabriele Santin, Bernard Haasdonk
Since the computation of an optimal selection of sampling points may be an infeasible task, one promising option is to use greedy methods.
Numerical Analysis
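The greedy selection idea mentioned above can be sketched in a few lines: at each step, select the candidate point where the current kernel interpolant has the largest residual (an f-greedy rule), refit, and repeat. The 1-D target, Gaussian kernel, and parameters below are illustrative assumptions, not the general setting treated in the papers.

```python
import numpy as np

# Gaussian kernel on 1-D points (hypothetical toy choice).
def gauss_kernel(a, b, eps=4.0):
    return np.exp(-(eps * (a[:, None] - b[None, :])) ** 2)

f = lambda x: np.sin(2 * np.pi * x) + 0.5 * x    # target to approximate
grid = np.linspace(0.0, 1.0, 201)                # candidate points
fvals = f(grid)

selected = [int(np.argmax(np.abs(fvals)))]       # start at largest |f|
for _ in range(14):
    pts = grid[selected]
    K = gauss_kernel(pts, pts)
    coef = np.linalg.solve(K + 1e-10 * np.eye(len(pts)), fvals[selected])
    residual = fvals - gauss_kernel(grid, pts) @ coef   # error on the grid
    residual[selected] = 0.0                     # never re-pick a point
    selected.append(int(np.argmax(np.abs(residual))))   # f-greedy step

max_err = np.max(np.abs(residual))               # small after a few points
```

The sparsity comes from the loop itself: only the handful of selected points enter the final expansion, so the surrogate stays cheap to store and evaluate.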
1 code implementation • 27 Sep 2019 • Roman Föll, Bernard Haasdonk, Markus Hanselmann, Holger Ulmer
In this paper we introduce several new Deep Recurrent Gaussian Process (DRGP) models based on the Sparse Spectrum Gaussian Process (SSGP) and its improved variant, the Variational Sparse Spectrum Gaussian Process (VSSGP).
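A minimal sketch of the sparse-spectrum idea underlying SSGP, assuming a Gaussian kernel: sample spectral frequencies from the kernel's spectral density, build cosine/sine features, and compute the Bayesian linear-regression posterior mean in that feature space. The toy data and hyperparameters are assumptions, and the deep/recurrent structure of DRGP is not shown.

```python
import numpy as np

rng = np.random.default_rng(1)

M, lengthscale, noise = 50, 0.3, 0.1
# Spectral points: for a Gaussian kernel the spectral density is
# Gaussian with std 1/lengthscale.
omega = rng.normal(0.0, 1.0 / lengthscale, size=M)

def features(x):
    z = np.outer(x, omega)
    # Scaling by 1/sqrt(M) makes phi(x).phi(x') approximate the kernel.
    return np.concatenate([np.cos(z), np.sin(z)], axis=1) / np.sqrt(M)

x_train = rng.uniform(-2.0, 2.0, 80)
y_train = np.sin(3 * x_train) + noise * rng.normal(size=80)

Phi = features(x_train)
A = Phi.T @ Phi + noise**2 * np.eye(2 * M)
w = np.linalg.solve(A, Phi.T @ y_train)          # posterior mean weights

x_test = np.linspace(-2.0, 2.0, 100)
pred = features(x_test) @ w
rmse = np.sqrt(np.mean((pred - np.sin(3 * x_test)) ** 2))
```

The payoff is cost: inference scales with the number of spectral points M rather than the number of data points, which is what makes stacking such layers into deeper models tractable.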
1 code implementation • 24 Jul 2019 • Gabriele Santin, Bernard Haasdonk
Second, if a function is available only through measurements or a small number of function evaluations, kernel approximation techniques can provide function surrogates that allow global evaluation.
Numerical Analysis
1 code implementation • 25 Jul 2018 • Gabriele Santin, Dominik Wittwar, Bernard Haasdonk
Kernel-based regularized interpolation is a well-known technique for approximating a continuous multivariate function from a set of scattered data points and the corresponding function evaluations, or data values.
Numerical Analysis
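In its basic form, the technique above solves the regularized linear system (K + lam*I)c = y on the data sites and evaluates the surrogate s(x) = sum_j c_j k(x, x_j). A hypothetical 1-D sketch with a Gaussian kernel (the paper treats the general multivariate setting):

```python
import numpy as np

rng = np.random.default_rng(2)

def kernel(a, b, eps=3.0):
    return np.exp(-(eps * (a[:, None] - b[None, :])) ** 2)

x_data = rng.uniform(0.0, 1.0, 40)      # scattered data sites
y_data = np.cos(4 * x_data)             # data values

lam = 1e-6                              # regularization parameter
K = kernel(x_data, x_data)
coef = np.linalg.solve(K + lam * np.eye(len(x_data)), y_data)

x_eval = np.linspace(0.0, 1.0, 200)
s = kernel(x_eval, x_data) @ coef       # global surrogate evaluation
err = np.max(np.abs(s - np.cos(4 * x_eval)))
```

The regularization parameter lam trades exact interpolation (lam -> 0) against conditioning and noise robustness; here a tiny value suffices because the data are noise-free.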
1 code implementation • 2 Nov 2017 • Roman Föll, Bernard Haasdonk, Markus Hanselmann, Holger Ulmer
Modeling sequential data has become increasingly important in practice.