Search Results for author: Diyora Salimova

Found 5 papers, 1 paper with code

Approximation properties of Residual Neural Networks for Kolmogorov PDEs

no code implementations · 30 Oct 2021 · Jonas Baggenstos, Diyora Salimova

In this article we show that ResNets can approximate solutions of Kolmogorov partial differential equations (PDEs) with constant diffusion and possibly nonlinear drift coefficients without suffering from the curse of dimensionality; that is, the number of parameters of the approximating ResNets grows at most polynomially in both the reciprocal of the approximation accuracy $\varepsilon > 0$ and the dimension $d \in \mathbb{N}$ of the considered PDE.
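The shape of such a "no curse of dimensionality" bound can be sketched as follows (a generic form, with illustrative constants $C, p, q$ that are not taken from the paper):

```latex
% Hedged sketch: P(\varepsilon, d) denotes the number of parameters of a
% ResNet achieving accuracy \varepsilon in dimension d; the constants
% C, p, q \ge 0 are generic placeholders, not the paper's values.
P(\varepsilon, d) \;\le\; C \, d^{p} \, \varepsilon^{-q}
\qquad \text{for all } \varepsilon \in (0,1], \; d \in \mathbb{N}.
```

The key point is that $d$ enters only polynomially, in contrast to grid-based methods whose cost typically grows exponentially in $d$.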


Weak error analysis for stochastic gradient descent optimization algorithms

no code implementations · 3 Jul 2020 · Aritz Bercher, Lukas Gonon, Arnulf Jentzen, Diyora Salimova

In applications one is often not only interested in the size of the error with respect to the objective function but also in the size of the error with respect to a test function which is possibly different from the objective function.


Space-time deep neural network approximations for high-dimensional partial differential equations

no code implementations · 3 Jun 2020 · Fabian Hornung, Arnulf Jentzen, Diyora Salimova

Each of these results establishes that DNNs overcome the curse of dimensionality in approximating suitable PDE solutions at a fixed time point $T>0$ and on a compact cube $[a, b]^d$ in space, but none of these results answers the question of whether the entire PDE solution on $[0, T]\times [a, b]^d$ can be approximated by DNNs without the curse of dimensionality.


Deep neural network approximations for Monte Carlo algorithms

1 code implementation · 28 Aug 2019 · Philipp Grohs, Arnulf Jentzen, Diyora Salimova

One key argument in most of these results is first to use a Monte Carlo approximation scheme that can approximate the solution of the PDE under consideration at a fixed space-time point without the curse of dimensionality and, thereafter, to prove that DNNs are flexible enough to mimic the behaviour of the approximation scheme used.
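The first step of that argument can be illustrated with a minimal Feynman-Kac Monte Carlo sketch for the $d$-dimensional heat equation $u_t = \Delta u$, $u(0,\cdot) = \varphi$, whose solution satisfies $u(T,x) = \mathbb{E}[\varphi(x + \sqrt{2T}\,Z)]$ with $Z \sim \mathcal{N}(0, I_d)$. This is a hedged, simplified illustration (the function names and the test function `phi` are my own, not from the paper); the cost per sample grows only linearly in $d$, which is the dimension-robustness that the paper's DNNs then mimic.

```python
import numpy as np

def monte_carlo_heat(phi, x, T, n_samples=100_000, seed=None):
    """Feynman-Kac Monte Carlo estimate of u(T, x) for u_t = Laplacian(u),
    u(0, .) = phi, via u(T, x) = E[phi(x + sqrt(2T) * Z)], Z ~ N(0, I_d)."""
    rng = np.random.default_rng(seed)
    d = x.shape[0]
    # Sample endpoints of scaled Brownian motion started at x at time T.
    z = rng.standard_normal((n_samples, d))
    endpoints = x + np.sqrt(2.0 * T) * z
    return phi(endpoints).mean()

if __name__ == "__main__":
    d = 10
    phi = lambda y: np.sum(y**2, axis=-1)   # phi(y) = |y|^2
    x = np.zeros(d)
    # For this phi the exact value is u(T, 0) = 2 * T * d = 10.0 at T = 0.5.
    approx = monte_carlo_heat(phi, x, T=0.5, n_samples=200_000, seed=0)
    print(approx)  # close to 10
```

Note that the estimator evaluates the solution only at the single point $(T, x)$; the paper's contribution is showing that a DNN can reproduce such a scheme across the whole spatial domain with polynomially many parameters.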

A proof that deep artificial neural networks overcome the curse of dimensionality in the numerical approximation of Kolmogorov partial differential equations with constant diffusion and nonlinear drift coefficients

no code implementations · 19 Sep 2018 · Arnulf Jentzen, Diyora Salimova, Timo Welti

These numerical simulations indicate that DNNs seem to possess the fundamental flexibility to overcome the curse of dimensionality in the sense that the number of real parameters used to describe the DNN grows at most polynomially in both the reciprocal of the prescribed approximation accuracy $ \varepsilon > 0 $ and the dimension $ d \in \mathbb{N}$ of the function which the DNN aims to approximate in such computational problems.

