Search Results for author: David Holzmüller

Found 10 papers, 9 papers with code

Predicting Properties of Periodic Systems from Cluster Data: A Case Study of Liquid Water

1 code implementation • 3 Dec 2023 • Viktor Zaverkin, David Holzmüller, Robin Schuldt, Johannes Kästner

The accuracy of the training data limits the accuracy of bulk properties from machine-learned potentials.

Uncertainty-biased molecular dynamics for learning uniformly accurate interatomic potentials

no code implementations • 3 Dec 2023 • Viktor Zaverkin, David Holzmüller, Henrik Christiansen, Federico Errica, Francesco Alesiani, Makoto Takamoto, Mathias Niepert, Johannes Kästner

Existing biased and unbiased MD simulations, however, are prone to miss either rare events or extrapolative regions -- areas of the configurational space where unreliable predictions are made.

Active Learning
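
To make the title's idea concrete, here is a minimal, hypothetical sketch of biasing a potential energy surface by the disagreement of a model ensemble; the `ensemble` interface and the bias form are illustrative assumptions, not the paper's actual bias potential.

```python
import numpy as np

def biased_energy(positions, ensemble, bias_strength=0.5):
    """Hypothetical sketch: lower the energy of configurations where an
    ensemble of interatomic potentials disagrees, so that plain MD on the
    biased surface preferentially visits uncertain (extrapolative) regions.
    `ensemble` is a list of callables mapping positions -> energy."""
    energies = np.array([model(positions) for model in ensemble])
    uncertainty = energies.std()  # ensemble disagreement as uncertainty proxy
    return energies.mean() - bias_strength * uncertainty

# Toy usage: three "models" that disagree away from their reference points.
ensemble = [lambda x, s=s: float(np.sum((x - s) ** 2)) for s in (0.0, 0.1, -0.1)]
print(biased_energy(np.array([0.5]), ensemble))
```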

Mind the spikes: Benign overfitting of kernels and neural networks in fixed dimension

1 code implementation • NeurIPS 2023 • Moritz Haas, David Holzmüller, Ulrike von Luxburg, Ingo Steinwart

In this paper, we show that the smoothness of the estimators, and not the dimension, is the key: benign overfitting is possible if and only if the estimator's derivatives are large enough.

regression
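
A minimal sketch of the phenomenon described above, assuming a hand-picked "smooth plus spiky" kernel (the widths and weights are illustrative, not the paper's construction): near-ridgeless kernel regression that interpolates noisy labels while tracking the smooth target elsewhere.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
X = np.sort(rng.uniform(-1, 1, n))
y = np.sin(3 * X) + 0.3 * rng.standard_normal(n)   # noisy labels

def kernel(a, b, spike_width=1e-3, spike_weight=0.2):
    """Smooth Gaussian kernel plus a very narrow 'spike' component.
    The spikes let the interpolant absorb label noise locally while
    staying close to the smooth trend away from the training points."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-d2 / 0.1) + spike_weight * np.exp(-d2 / spike_width)

K = kernel(X, X)
alpha = np.linalg.solve(K + 1e-10 * np.eye(n), y)  # (near-)ridgeless fit

X_test = np.linspace(-1, 1, 400)
y_pred = kernel(X_test, X) @ alpha
print("train MSE:", np.mean((K @ alpha - y) ** 2))
print("test MSE vs clean target:", np.mean((y_pred - np.sin(3 * X_test)) ** 2))
```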

Convergence Rates for Non-Log-Concave Sampling and Log-Partition Estimation

1 code implementation • 6 Mar 2023 • David Holzmüller, Francis Bach

Specifically, for $m$-times differentiable functions in $d$ dimensions, the optimal rate for algorithms with $n$ function evaluations is known to be $O(n^{-m/d})$, where the constant can potentially depend on $m, d$ and the function to be optimized.
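
For a concrete instance of the estimation problem, the following is a naive midpoint-grid estimator of the log-partition function $\log \int_{[0,1]} e^{f(x)}\,dx$ in $d = 1$; it is a baseline sketch only, not one of the paper's rate-optimal algorithms.

```python
import numpy as np
from scipy.special import logsumexp

def log_partition_grid(f, n):
    """Estimate log of the integral of exp(f) over [0, 1] with an n-point
    midpoint rule. A naive baseline; the paper studies the optimal rate
    O(n^{-m/d}) for m-times differentiable f, which this does not attain
    in general."""
    x = (np.arange(n) + 0.5) / n
    return logsumexp(f(x)) - np.log(n)   # log( (1/n) * sum_i exp(f(x_i)) )

f = lambda x: 5.0 * np.cos(6 * np.pi * x)   # multimodal, non-log-concave
ref = log_partition_grid(f, 1_000_000)      # fine-grid reference value
for n in (10, 100, 1000):
    print(n, abs(log_partition_grid(f, n) - ref))
```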

Transfer learning for chemically accurate interatomic neural network potentials

1 code implementation • 7 Dec 2022 • Viktor Zaverkin, David Holzmüller, Luca Bonfirraro, Johannes Kästner

This work studies the capability of transfer learning, in particular discriminative fine-tuning, for efficiently generating chemically accurate interatomic neural network potentials on organic molecules from the MD17 and ANI data sets.

Atomic Forces • Transfer Learning
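
Discriminative fine-tuning, as commonly understood, assigns smaller learning rates to earlier (pretrained) layers than to the output head. A minimal PyTorch sketch of that idea, with a made-up two-layer model standing in for a pretrained potential; the architecture and learning rates are assumptions for illustration.

```python
import torch
from torch import nn

# Hypothetical pretrained potential: feature extractor plus readout head.
model = nn.Sequential(
    nn.Linear(128, 64), nn.SiLU(),   # "backbone" layers (pretrained)
    nn.Linear(64, 1),                # "head" (retrained on accurate data)
)

# Discriminative fine-tuning: smaller learning rate for the pretrained
# backbone, larger for the head, so learned features change slowly.
optimizer = torch.optim.Adam([
    {"params": model[0].parameters(), "lr": 1e-5},
    {"params": model[2].parameters(), "lr": 1e-3},
])
```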

A Framework and Benchmark for Deep Batch Active Learning for Regression

2 code implementations • 17 Mar 2022 • David Holzmüller, Viktor Zaverkin, Johannes Kästner, Ingo Steinwart

We provide open-source code that includes efficient implementations of all kernels, kernel transformations, and selection methods, and can be used for reproducing our results.

Active Learning • regression +1
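
As one simple member of the feature-space selection family such a framework benchmarks (not the paper's specific method), here is a sketch of greedy farthest-point batch selection; `features` and `train_idx` are hypothetical inputs.

```python
import numpy as np

def greedy_batch(features, train_idx, batch_size):
    """Greedy farthest-point selection: repeatedly pick the pool point
    farthest (in feature space) from everything selected so far.
    Assumes `train_idx` is non-empty."""
    selected = list(train_idx)
    pool = [i for i in range(len(features)) if i not in selected]
    # Distance of each pool point to its nearest already-selected point.
    d = np.min(np.linalg.norm(
        features[pool][:, None] - features[selected][None, :], axis=-1), axis=1)
    batch = []
    for _ in range(batch_size):
        j = int(np.argmax(d))
        batch.append(pool[j])
        # Newly selected point may now be the nearest neighbor of others.
        d = np.minimum(d, np.linalg.norm(features[pool] - features[pool[j]], axis=-1))
    return batch

feats = np.random.default_rng(0).standard_normal((100, 8))
print(greedy_batch(feats, train_idx=[0, 1], batch_size=5))
```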

Fast and Sample-Efficient Interatomic Neural Network Potentials for Molecules and Materials Based on Gaussian Moments

1 code implementation • 20 Sep 2021 • Viktor Zaverkin, David Holzmüller, Ingo Steinwart, Johannes Kästner

Artificial neural networks (NNs) are one of the most frequently used machine learning approaches to construct interatomic potentials and enable efficient large-scale atomistic simulations with almost ab initio accuracy.

Active Learning

On the Universality of the Double Descent Peak in Ridgeless Regression

1 code implementation • ICLR 2021 • David Holzmüller

We prove a non-asymptotic distribution-independent lower bound for the expected mean squared generalization error caused by label noise in ridgeless linear regression.

regression
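
The bound concerns the error spike at the interpolation threshold of ridgeless linear regression. A standard simulation sketch, assuming Gaussian design and label noise and using the minimum-norm solution via the pseudoinverse, that exhibits the peak near $d = n$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, noise = 40, 0.5
w_true = rng.standard_normal(200) / np.sqrt(200)

for d in (10, 20, 39, 40, 41, 80, 200):
    errs = []
    for _ in range(50):
        X = rng.standard_normal((n, d))
        y = X @ w_true[:d] + noise * rng.standard_normal(n)
        w = np.linalg.pinv(X) @ y          # minimum-norm (ridgeless) solution
        X_te = rng.standard_normal((1000, d))
        errs.append(np.mean((X_te @ w - X_te @ w_true[:d]) ** 2))
    print(d, np.mean(errs))                # error spikes near d == n
```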

Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent

1 code implementation • 12 Feb 2020 • David Holzmüller, Ingo Steinwart

We prove that two-layer (Leaky)ReLU networks initialized by, e.g., the widely used method proposed by He et al. (2015) and trained using gradient descent on a least-squares loss are not universally consistent.

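A minimal numpy sketch of the training setup the theorem analyzes: He et al. (2015) initialization and full-batch gradient descent on a least-squares loss. The inconsistency result is asymptotic, so a run like this only illustrates the setup, not the theorem itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n, width = 64, 256
X = rng.uniform(-1, 1, (n, 1))
y = np.sin(np.pi * X[:, 0])

# He et al. (2015) initialization: variance 2 / fan_in.
W = rng.standard_normal((width, 1)) * np.sqrt(2.0 / 1)
b = np.zeros(width)
v = rng.standard_normal(width) * np.sqrt(2.0 / width)

lr = 1e-2
for step in range(2000):
    H = np.maximum(X @ W.T + b, 0.0)        # ReLU activations, shape (n, width)
    r = H @ v - y                           # residuals of the squared loss
    mask = (H > 0).astype(float)
    # Full-batch gradient descent on 0.5 * mean squared error.
    grad_v = H.T @ r / n
    grad_pre = (r[:, None] * v[None, :]) * mask   # gradient w.r.t. pre-activations
    grad_W = grad_pre.T @ X / n
    grad_b = grad_pre.sum(axis=0) / n
    v -= lr * grad_v; W -= lr * grad_W; b -= lr * grad_b

pred = np.maximum(X @ W.T + b, 0.0) @ v
print("final training MSE:", np.mean((pred - y) ** 2))
```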

Efficient Neighbor-Finding on Space-Filling Curves

2 code implementations • 17 Oct 2017 • David Holzmüller

In this thesis, we show how neighbors on many regular grids ordered by space-filling curves can be found in an average-case time complexity of $O(1)$.

Computational Geometry • Data Structures and Algorithms • Performance
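
For contrast with the thesis's result, here is the obvious decode-offset-encode baseline for neighbor lookup on a 2D Z-order (Morton) curve, one of the space-filling curves such grids are ordered by; the thesis's algorithms avoid this bit-by-bit decoding to reach $O(1)$ average time.

```python
def interleave(x, y, bits=16):
    """Morton (Z-order) index of cell (x, y) on a 2^bits x 2^bits grid."""
    z = 0
    for i in range(bits):
        z |= ((x >> i) & 1) << (2 * i) | ((y >> i) & 1) << (2 * i + 1)
    return z

def deinterleave(z, bits=16):
    """Recover (x, y) from a Morton index."""
    x = y = 0
    for i in range(bits):
        x |= ((z >> (2 * i)) & 1) << i
        y |= ((z >> (2 * i + 1)) & 1) << i
    return x, y

def neighbor(z, dx, dy, bits=16):
    """Naive decode-offset-encode neighbor lookup, O(bits) per query;
    only a baseline, not the thesis's constant-average-time method."""
    x, y = deinterleave(z, bits)
    return interleave(x + dx, y + dy, bits)

z = interleave(5, 9)
print(deinterleave(neighbor(z, 1, 0)))   # -> (6, 9)
```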
