no code implementations • 3 Dec 2023 • Viktor Zaverkin, David Holzmüller, Henrik Christiansen, Federico Errica, Francesco Alesiani, Makoto Takamoto, Mathias Niepert, Johannes Kästner

Existing biased and unbiased MD simulations, however, are prone to missing either rare events or extrapolative regions, i.e., areas of configurational space where unreliable predictions are made.

1 code implementation • 3 Dec 2023 • Viktor Zaverkin, David Holzmüller, Robin Schuldt, Johannes Kästner

The accuracy of the training data limits the accuracy of bulk properties predicted by machine-learned potentials.

1 code implementation • NeurIPS 2023 • Moritz Haas, David Holzmüller, Ulrike von Luxburg, Ingo Steinwart

In this paper, we show that the smoothness of the estimators, and not the dimension, is the key: benign overfitting is possible if and only if the estimator's derivatives are large enough.

1 code implementation • 6 Mar 2023 • David Holzmüller, Francis Bach

Specifically, for $m$-times differentiable functions in $d$ dimensions, the optimal rate for algorithms with $n$ function evaluations is known to be $O(n^{-m/d})$, where the constant can potentially depend on $m, d$ and the function to be optimized.
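Inverting this rate gives a back-of-the-envelope view of the curse of dimensionality: reaching a target error $\varepsilon$ requires on the order of $n \sim \varepsilon^{-d/m}$ evaluations (constants ignored). A minimal illustrative sketch, with a function name of our own choosing:

```python
def evaluations_needed(eps, m, d):
    """Order-of-magnitude evaluation count for target error eps,
    obtained by inverting the optimal rate eps ~ n^(-m/d).
    Constants are ignored, so this is purely illustrative."""
    return eps ** (-d / m)

# Smoothness m offsets dimension d: for eps = 0.1 and d = 4,
# doubling m cuts the required evaluations from ~1e4 to ~1e2.
print(evaluations_needed(0.1, m=1, d=4))  # ~1e4
print(evaluations_needed(0.1, m=2, d=4))  # ~1e2
```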

1 code implementation • 7 Dec 2022 • Viktor Zaverkin, David Holzmüller, Luca Bonfirraro, Johannes Kästner

This work studies the capability of transfer learning, in particular discriminative fine-tuning, for efficiently generating chemically accurate interatomic neural network potentials on organic molecules from the MD17 and ANI data sets.

2 code implementations • 17 Mar 2022 • David Holzmüller, Viktor Zaverkin, Johannes Kästner, Ingo Steinwart

We provide open-source code that includes efficient implementations of all kernels, kernel transformations, and selection methods, and can be used for reproducing our results.

1 code implementation • 20 Sep 2021 • Viktor Zaverkin, David Holzmüller, Ingo Steinwart, Johannes Kästner

Artificial neural networks (NNs) are one of the most frequently used machine learning approaches to construct interatomic potentials and enable efficient large-scale atomistic simulations with almost ab initio accuracy.

1 code implementation • ICLR 2021 • David Holzmüller

We prove a non-asymptotic distribution-independent lower bound for the expected mean squared generalization error caused by label noise in ridgeless linear regression.

1 code implementation • 12 Feb 2020 • David Holzmüller, Ingo Steinwart

We prove that two-layer (Leaky)ReLU networks initialized by, e.g., the widely used method proposed by He et al. (2015) and trained using gradient descent on a least-squares loss are not universally consistent.
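For context, the He et al. (2015) initialization referenced above draws zero-mean Gaussian weights with variance 2/fan_in, chosen so that ReLU activations keep a stable scale across layers. A minimal standard-library sketch (function name and list-of-lists layout are ours, not from the paper):

```python
import math
import random

def he_init(fan_in, fan_out, seed=0):
    """He et al. (2015) initialization: zero-mean Gaussian weights
    with standard deviation sqrt(2 / fan_in), suited to ReLU layers."""
    rng = random.Random(seed)
    std = math.sqrt(2.0 / fan_in)
    return [[rng.gauss(0.0, std) for _ in range(fan_out)]
            for _ in range(fan_in)]

# A (500 x 10) weight matrix whose entries have variance 2/500 = 0.004.
W = he_init(500, 10)
```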

2 code implementations • 17 Oct 2017 • David Holzmüller

In this thesis, we show how neighbors on many regular grids ordered by space-filling curves can be found in an average-case time complexity of $O(1)$.
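A minimal sketch of the index mapping underlying such curves, for the 2D Z-order (Morton) curve: a cell's position on the curve is obtained by interleaving the bits of its grid coordinates, and a neighbor's curve index can be recovered by re-encoding the offset coordinates. The helper names are ours, and this naive re-encoding is only an illustration, not the thesis's constant-average-time scheme:

```python
def morton_index(x, y, bits=16):
    """Z-order (Morton) index of grid cell (x, y): interleave coordinate bits."""
    z = 0
    for i in range(bits):
        z |= ((x >> i) & 1) << (2 * i)       # x bits occupy even positions
        z |= ((y >> i) & 1) << (2 * i + 1)   # y bits occupy odd positions
    return z

def neighbor_morton_index(x, y, dx, dy, bits=16):
    """Curve index of the grid neighbor at offset (dx, dy),
    found by naively re-encoding the shifted coordinates."""
    return morton_index(x + dx, y + dy, bits)

# The first four cells of the curve trace the familiar "Z" shape:
print([morton_index(x, y) for y in (0, 1) for x in (0, 1)])  # [0, 1, 2, 3]
```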

Computational Geometry · Data Structures and Algorithms · Performance

Papers With Code is a free resource with all data licensed under CC-BY-SA.