Search Results for author: Johannes Kästner

Found 8 papers, 6 papers with code

ZnTrack -- Data as Code

1 code implementation • 19 Jan 2024 • Fabian Zills, Moritz Schäfer, Samuel Tovey, Johannes Kästner, Christian Holm

The past decade has seen tremendous breakthroughs in computation, and there is no indication that this pace will slow any time soon.


Uncertainty-biased molecular dynamics for learning uniformly accurate interatomic potentials

no code implementations • 3 Dec 2023 • Viktor Zaverkin, David Holzmüller, Henrik Christiansen, Federico Errica, Francesco Alesiani, Makoto Takamoto, Mathias Niepert, Johannes Kästner

Existing biased and unbiased MD simulations, however, are prone to miss either rare events or extrapolative regions -- areas of the configurational space where unreliable predictions are made.

Active Learning
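The abstract above contrasts biased and unbiased MD and the risk of missing extrapolative regions. A common way to flag such regions is ensemble disagreement: train several models independently and treat the spread of their predictions as an uncertainty signal. A minimal NumPy sketch of the idea (the linear "models" and toy data here are illustrative, not the paper's actual interatomic potentials):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "ensemble": three linear models fit to noisy copies of the same data.
X = rng.uniform(-1, 1, size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5])
models = []
for seed in range(3):
    noise = rng.normal(scale=0.1, size=y.shape)
    w, *_ = np.linalg.lstsq(X, y + noise, rcond=None)
    models.append(w)

def predict_with_uncertainty(x):
    """Mean prediction and ensemble std (the uncertainty signal)."""
    preds = np.array([x @ w for w in models])
    return preds.mean(), preds.std()

# In-distribution point: the models largely agree.
mean_in, std_in = predict_with_uncertainty(np.array([0.1, 0.2, -0.1]))
# Extrapolative point far outside the training box: disagreement grows.
mean_out, std_out = predict_with_uncertainty(np.array([10.0, -8.0, 5.0]))
```

In uncertainty-biased MD this signal can then steer the simulation toward (or away from) high-uncertainty configurations, which is how such schemes target rare events and extrapolative regions.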

Thermally Averaged Magnetic Anisotropy Tensors via Machine Learning Based on Gaussian Moments

1 code implementation • 3 Dec 2023 • Viktor Zaverkin, Julia Netz, Fabian Zills, Andreas Köhn, Johannes Kästner

We propose a machine learning method to model molecular tensorial quantities, namely the magnetic anisotropy tensor, based on the Gaussian-moment neural-network approach.

Predicting Properties of Periodic Systems from Cluster Data: A Case Study of Liquid Water

1 code implementation • 3 Dec 2023 • Viktor Zaverkin, David Holzmüller, Robin Schuldt, Johannes Kästner

The accuracy of the training data limits the accuracy of bulk properties from machine-learned potentials.

Transfer learning for chemically accurate interatomic neural network potentials

1 code implementation • 7 Dec 2022 • Viktor Zaverkin, David Holzmüller, Luca Bonfirraro, Johannes Kästner

This work studies the capability of transfer learning, in particular discriminative fine-tuning, for efficiently generating chemically accurate interatomic neural network potentials on organic molecules from the MD17 and ANI data sets.

Atomic Forces, Transfer Learning
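Discriminative fine-tuning, mentioned in the abstract above, updates pretrained layers with smaller learning rates than the freshly initialized output head, so transferred features change slowly while the head adapts quickly. A schematic NumPy version of one update step (the two-layer "network" and the specific rates are illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)

# Pretrained "feature" layer and a fresh output head for the new task.
W_feat = rng.normal(size=(4, 8))   # pretrained: should move slowly
W_head = rng.normal(size=(8, 1))   # new head: can move quickly

# Discriminative fine-tuning: per-layer learning rates.
lr = {"feat": 1e-4, "head": 1e-2}

def sgd_step(grads):
    """One SGD step with layer-dependent learning rates."""
    global W_feat, W_head
    W_feat = W_feat - lr["feat"] * grads["feat"]
    W_head = W_head - lr["head"] * grads["head"]

# Dummy gradients of the same shapes as the weights.
grads = {"feat": np.ones_like(W_feat), "head": np.ones_like(W_head)}
before_feat, before_head = W_feat.copy(), W_head.copy()
sgd_step(grads)
feat_change = np.abs(W_feat - before_feat).max()
head_change = np.abs(W_head - before_head).max()
```

The same pattern is what deep-learning frameworks express through per-parameter-group learning rates in their optimizers.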

A Framework and Benchmark for Deep Batch Active Learning for Regression

2 code implementations • 17 Mar 2022 • David Holzmüller, Viktor Zaverkin, Johannes Kästner, Ingo Steinwart

We provide open-source code that includes efficient implementations of all kernels, kernel transformations, and selection methods, and can be used for reproducing our results.

Active Learning, Regression +1
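The selection methods benchmarked above typically trade off informativeness against diversity of the chosen batch. One classic diversity-driven baseline is greedy farthest-point (max-min distance) selection in feature space, sketched here; this is a simplified stand-in, not the paper's kernel-based rules:

```python
import numpy as np

def farthest_point_batch(X, batch_size, first=0):
    """Greedy max-min selection: each new point maximizes its distance
    to the nearest already-selected point."""
    selected = [first]
    # Distance from every point to its nearest selected point so far.
    dists = np.linalg.norm(X - X[first], axis=1)
    for _ in range(batch_size - 1):
        nxt = int(np.argmax(dists))
        selected.append(nxt)
        dists = np.minimum(dists, np.linalg.norm(X - X[nxt], axis=1))
    return selected

rng = np.random.default_rng(0)
X = rng.uniform(size=(100, 2))
batch = farthest_point_batch(X, batch_size=5)
```

Because a selected point's distance to the selected set drops to zero, the greedy loop never picks the same point twice, and the batch spreads out over the feature space.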

Fast and Sample-Efficient Interatomic Neural Network Potentials for Molecules and Materials Based on Gaussian Moments

1 code implementation • 20 Sep 2021 • Viktor Zaverkin, David Holzmüller, Ingo Steinwart, Johannes Kästner

Artificial neural networks (NNs) are one of the most frequently used machine learning approaches to construct interatomic potentials and enable efficient large-scale atomistic simulations with almost ab initio accuracy.

Active Learning

Gaussian Moments as Physically Inspired Molecular Descriptors for Accurate and Scalable Machine Learning Potentials

no code implementations • 15 Sep 2021 • Viktor Zaverkin, Johannes Kästner

Machine learning techniques allow a direct mapping of atomic positions and nuclear charges to the potential energy surface with almost ab initio accuracy and the computational efficiency of empirical potentials.

BIG-bench Machine Learning, Computational Efficiency
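The mapping described above only works if the descriptor fed to the model respects the physical symmetries of the potential energy surface: translating or rotating the molecule must not change its features. A toy radial descriptor built from Gaussians of interatomic distances illustrates this invariance (a simplified stand-in, not the paper's actual Gaussian-moment construction):

```python
import numpy as np

def radial_descriptor(positions, centers, width=0.5):
    """Toy rotation- and translation-invariant descriptor: for each atom,
    expand its neighbor distances in a basis of Gaussians."""
    n = len(positions)
    desc = np.zeros((n, len(centers)))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r = np.linalg.norm(positions[i] - positions[j])
            desc[i] += np.exp(-((r - centers) ** 2) / (2 * width**2))
    return desc

pos = np.array([[0.0, 0.0, 0.0],
                [1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0]])
centers = np.linspace(0.5, 2.0, 4)
d1 = radial_descriptor(pos, centers)

# Rotating the whole structure leaves interatomic distances, and hence
# the descriptor, unchanged.
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
d2 = radial_descriptor(pos @ R.T, centers)
```

Descriptors like Gaussian moments extend this idea beyond purely radial information while keeping the same invariances.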
