Numerical integration is the task of calculating the numerical value of a definite integral or the numerical solution of differential equations.
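As a minimal illustration of the first case, the composite trapezoidal rule approximates a definite integral by summing the areas of trapezoids under the integrand; the function name and step count below are illustrative choices.

```python
import math

def trapezoid(f, a, b, n=1000):
    """Approximate the definite integral of f over [a, b] with n trapezoids."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# Integrate sin(x) on [0, pi]; the exact value is 2.
approx = trapezoid(math.sin, 0.0, math.pi, n=1000)
```

The error of this rule decays quadratically in the step size h, which is why the result with n = 1000 is already accurate to several digits.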
In the absence of DPP machinery to derive an efficient sampler and analyze the resulting estimator, the idea of Monte Carlo integration with DPPs remained stored in the cellar of numerical integration.
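A DPP sampler itself is beyond a few lines, but the estimator it improves upon is easy to state. The sketch below is the vanilla Monte Carlo estimator with i.i.d. uniform samples; a DPP-based method would replace those samples with negatively correlated (repulsive) points to reduce variance. Function and parameter names are illustrative.

```python
import random

def mc_integrate(f, a, b, n=100_000, seed=0):
    """Vanilla Monte Carlo estimate of the integral of f on [a, b] using
    i.i.d. uniform samples; a DPP sampler would supply repulsive points
    here to lower the variance of the same weighted average."""
    rng = random.Random(seed)
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n

# Integral of x^2 on [0, 1]; the exact value is 1/3.
est = mc_integrate(lambda x: x * x, 0.0, 1.0)
```

With i.i.d. samples the error shrinks only as n^{-1/2}; the appeal of DPP samples is a provably faster rate for smooth integrands.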
Our multiscale hierarchical time-stepping scheme provides important advantages over current time-stepping algorithms, including (i) circumventing numerical stiffness due to disparate time-scales, (ii) improved accuracy in comparison with leading neural-network architectures, (iii) efficiency in long-time simulation/forecasting due to explicit training of slow time-scale dynamics, and (iv) a flexible framework that is parallelizable and may be integrated with standard numerical time-stepping algorithms.
That is why, despite the high computational cost, numerical integration is still the gold standard in many applications.
Integration of non-negative integrands is a central problem in machine learning (e.g., for model averaging, (hyper-)parameter marginalisation, and computing posterior predictive distributions).
Stochastic differential equations are an important modeling class in many disciplines.
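The simplest numerical scheme for such equations is Euler–Maruyama, which discretises dX = a(X, t) dt + b(X, t) dW by drawing a Gaussian increment per step. The sketch below simulates one path; the Ornstein–Uhlenbeck coefficients in the example are illustrative choices.

```python
import random

def euler_maruyama(drift, diffusion, x0, t0, t1, n_steps, seed=0):
    """Simulate one path of dX = drift(X,t) dt + diffusion(X,t) dW
    with the Euler-Maruyama scheme on a uniform grid."""
    rng = random.Random(seed)
    dt = (t1 - t0) / n_steps
    x, t = x0, t0
    path = [x0]
    for _ in range(n_steps):
        dw = rng.gauss(0.0, dt ** 0.5)  # Brownian increment ~ N(0, dt)
        x = x + drift(x, t) * dt + diffusion(x, t) * dw
        t += dt
        path.append(x)
    return path

# Ornstein-Uhlenbeck example: dX = -0.5 X dt + 0.1 dW, X(0) = 1.
path = euler_maruyama(lambda x, t: -0.5 * x, lambda x, t: 0.1, 1.0, 0.0, 1.0, 1000)
```

Euler–Maruyama converges with strong order 1/2, so halving the step size reduces the pathwise error only by a factor of about sqrt(2); higher-order schemes (e.g., Milstein) improve on this.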
We consider the problem of improving kernel approximation via randomized feature maps.
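One standard construction in this setting is random Fourier features (Rahimi and Recht), which approximate a shift-invariant kernel by an inner product of finite-dimensional random features. The sketch below targets the RBF kernel; the sampling scheme is the classical one, not any particular improved variant from the source.

```python
import numpy as np

def random_fourier_features(X, n_features=500, gamma=1.0, seed=0):
    """Map X of shape (n_samples, d) to features z(X) such that
    z(x) @ z(y) approximates exp(-gamma * ||x - y||^2), the RBF kernel."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the kernel's spectral density, phases uniform.
    W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Compare the feature inner product to the exact kernel value.
X = np.array([[0.0, 0.0], [0.5, 0.3]])
Z = random_fourier_features(X, n_features=5000)
approx_k = Z[0] @ Z[1]  # exact value: exp(-0.34) ~ 0.712
```

The approximation error decays as O(1/sqrt(n_features)), so improving the feature map means reducing the constant in that rate, e.g., via structured or quasi-random frequencies.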
To this end, adaptive schemes have been developed that rely on error estimators based on Taylor series expansions.
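A classical instance of such a scheme is step doubling: the difference between one step of size h and two steps of size h/2 estimates the leading Taylor remainder term, and the step size is accepted or shrunk accordingly. The sketch below uses RK4 and a simple acceptance rule; the tolerances and growth factors are illustrative, not taken from the source.

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def adaptive_solve(f, t0, y0, t_end, tol=1e-8, h=0.1):
    """Integrate y' = f(t, y) with step doubling: one step of size h is
    compared against two steps of size h/2; their difference estimates
    the local truncation error (the leading Taylor remainder term)."""
    t, y = t0, y0
    while t < t_end:
        h = min(h, t_end - t)
        full = rk4_step(f, t, y, h)
        half = rk4_step(f, t + h / 2, rk4_step(f, t, y, h / 2), h / 2)
        err = abs(half - full)
        if err <= tol:
            t, y = t + h, half          # accept the more accurate result
            if err < tol / 32:
                h *= 2                  # error comfortably small: grow h
        else:
            h /= 2                      # reject and retry with smaller h
    return y

# y' = y with y(0) = 1 gives y(1) = e.
y1 = adaptive_solve(lambda t, y: y, 0.0, 1.0, 1.0)
```

Production solvers (e.g., embedded Runge–Kutta pairs such as Dormand–Prince) obtain the same error estimate from a single step evaluation, but the acceptance logic is the same.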
We introduce the code i-flow, a Python package that performs high-dimensional numerical integration utilizing normalizing flows.
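The principle behind flow-based integrators is importance sampling: the integral of f is estimated as the mean of f(x)/q(x) over samples from a proposal q, and a normalizing flow is trained so that q tracks |f| (up to normalisation), driving the estimator's variance toward zero. The sketch below is not i-flow's API; it shows the principle with a fixed, hand-picked proposal in place of a trained flow.

```python
import math
import random

def importance_sample(f, sample_q, pdf_q, n=100_000, seed=0):
    """Estimate the integral of f as the mean of f(x)/q(x) for x ~ q.
    The closer q is to |f| up to a constant, the lower the variance;
    flow-based integrators learn such a q instead of fixing it."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = sample_q(rng)
        total += f(x) / pdf_q(x)
    return total / n

# With the perfect proposal q(x) = e^{-x} for f(x) = e^{-x} on [0, inf),
# every weight f(x)/q(x) equals 1 and the estimator has zero variance.
est = importance_sample(
    lambda x: math.exp(-x),
    lambda rng: rng.expovariate(1.0),
    lambda x: math.exp(-x),
)
```

The zero-variance limit in the example is exactly the target that training the flow pursues in the general, high-dimensional case.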