no code implementations • 3 Feb 2024 • Duy M. H. Nguyen, Nina Lukashina, Tai Nguyen, An T. Le, TrungTin Nguyen, Nhat Ho, Jan Peters, Daniel Sonntag, Viktor Zaverkin, Mathias Niepert
Contrary to prior work, we propose a novel 2D-3D aggregation mechanism based on a differentiable solver for the Fused Gromov-Wasserstein Barycenter problem, together with an efficient online conformer-generation method based on distance geometry.
1 code implementation • 27 Dec 2023 • Federico Errica, Henrik Christiansen, Viktor Zaverkin, Takashi Maruyama, Mathias Niepert, Francesco Alesiani
Long-range interactions are essential for the correct description of complex systems in many scientific fields.
no code implementations • 3 Dec 2023 • Viktor Zaverkin, David Holzmüller, Henrik Christiansen, Federico Errica, Francesco Alesiani, Makoto Takamoto, Mathias Niepert, Johannes Kästner
Existing biased and unbiased MD simulations, however, are prone to miss either rare events or extrapolative regions -- areas of the configurational space where unreliable predictions are made.
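A common way to detect such extrapolative regions is to measure the disagreement of an ensemble of models on each configuration; a minimal sketch of that idea (a generic ensemble-uncertainty criterion, not the specific uncertainty measure used in the paper — the function name and threshold are illustrative assumptions):

```python
import numpy as np

def extrapolation_mask(ensemble_energies, threshold):
    """Flag configurations where ensemble disagreement exceeds a threshold.

    ensemble_energies: array of shape (n_models, n_configs) with each
    model's predicted energy per configuration. High standard deviation
    across models signals an extrapolative (unreliable) region.
    """
    uncertainty = np.std(ensemble_energies, axis=0)
    return uncertainty > threshold
```

Configurations flagged by such a mask would then be candidates for labeling with a reference method and for retraining, rather than being trusted during the MD run.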
1 code implementation • 3 Dec 2023 • Viktor Zaverkin, Julia Netz, Fabian Zills, Andreas Köhn, Johannes Kästner
We propose a machine learning method to model molecular tensorial quantities, namely the magnetic anisotropy tensor, based on the Gaussian-moment neural-network approach.
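A practical detail when predicting a tensorial quantity such as the magnetic anisotropy tensor is that the output must be a symmetric 3x3 matrix; one simple way to guarantee this is to predict six independent components and assemble them symmetrically. A toy sketch of that assembly step (an illustration only — the Gaussian-moment approach in the paper builds symmetry and rotational equivariance into the features themselves):

```python
import numpy as np

def symmetric_tensor(components):
    """Assemble a symmetric 3x3 tensor from six independent components
    (xx, yy, zz, xy, xz, yz), the form a model's output head might take."""
    xx, yy, zz, xy, xz, yz = components
    return np.array([[xx, xy, xz],
                     [xy, yy, yz],
                     [xz, yz, zz]])
```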
1 code implementation • 3 Dec 2023 • Viktor Zaverkin, David Holzmüller, Robin Schuldt, Johannes Kästner
The accuracy of the training data limits the accuracy of bulk properties obtained from machine-learned potentials.
1 code implementation • 7 Dec 2022 • Viktor Zaverkin, David Holzmüller, Luca Bonfirraro, Johannes Kästner
This work studies the capability of transfer learning, in particular discriminative fine-tuning, for efficiently generating chemically accurate interatomic neural network potentials on organic molecules from the MD17 and ANI data sets.
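Discriminative fine-tuning typically assigns each layer of a pretrained network its own learning rate, with earlier layers (generic features) updated more gently than later ones. A minimal sketch of the learning-rate schedule (function name, base rate, and decay factor are illustrative assumptions, not values from the paper):

```python
def layerwise_learning_rates(n_layers, base_lr=1e-3, decay=2.0):
    """Per-layer learning rates for discriminative fine-tuning:
    the output layer gets base_lr, and each earlier layer's rate
    is reduced by a constant factor."""
    return [base_lr / decay ** (n_layers - 1 - i) for i in range(n_layers)]
```

In a typical setup these rates would be passed as per-parameter-group learning rates to the optimizer when fine-tuning a pretrained potential on a new data set.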
2 code implementations • 17 Mar 2022 • David Holzmüller, Viktor Zaverkin, Johannes Kästner, Ingo Steinwart
We provide open-source code that includes efficient implementations of all kernels, kernel transformations, and selection methods, and can be used for reproducing our results.
1 code implementation • 20 Sep 2021 • Viktor Zaverkin, David Holzmüller, Ingo Steinwart, Johannes Kästner
Artificial neural networks (NNs) are one of the most frequently used machine learning approaches to construct interatomic potentials and enable efficient large-scale atomistic simulations with almost ab initio accuracy.
no code implementations • 15 Sep 2021 • Viktor Zaverkin, Johannes Kästner
Machine learning techniques allow a direct mapping of atomic positions and nuclear charges to the potential energy surface with almost ab initio accuracy and the computational efficiency of empirical potentials.
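Such a mapping only works if the input representation respects the physical symmetries: the energy must not change under translation or rotation of the structure. A toy rotation- and translation-invariant descriptor built from interatomic distances illustrates the idea (this is a didactic sketch, not the Gaussian-moment features used in the work above; the width parameters are arbitrary assumptions):

```python
import numpy as np

def pairwise_descriptor(positions, widths=(0.5, 1.0, 2.0)):
    """Toy invariant descriptor: Gaussian-weighted sums over all
    interatomic distances. Distances are unchanged by rigid motions,
    so the descriptor is translation- and rotation-invariant."""
    diffs = positions[:, None, :] - positions[None, :, :]
    d = np.linalg.norm(diffs, axis=-1)
    dists = d[np.triu_indices(len(positions), k=1)]  # unique pairs
    return np.array([np.exp(-(dists / w) ** 2).sum() for w in widths])
```

A neural network potential would then regress energies from features of this kind rather than from raw Cartesian coordinates.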