SchNetPack is a versatile neural-network toolbox that addresses the requirements of both method development and the application of atomistic machine learning.
In recent years, the prediction of quantum mechanical observables with machine learning methods has become increasingly popular.
The rational design of molecules with desired properties is a long-standing challenge in chemistry.
Machine-learned force fields (ML-FFs) combine the accuracy of ab initio methods with the efficiency of conventional force fields.
Message passing neural networks have become a method of choice for learning on graphs, in particular the prediction of chemical properties and the acceleration of molecular dynamics studies.
We employ FieldSchNet to study the influence of solvent effects on molecular spectra and a Claisen rearrangement reaction.
In recent years, the use of Machine Learning (ML) in computational chemistry has enabled numerous advances previously out of reach due to the computational complexity of traditional electronic-structure methods.
In this paper, we show that GNNs can in fact be naturally explained using higher-order expansions, i.e., by identifying groups of edges that jointly contribute to the prediction.
Here, we present a strategy to work around both obstacles, and demonstrate autonomous robotic nanofabrication by manipulating single molecules.
Deep learning has proven to yield fast and accurate predictions of quantum-chemical properties to accelerate the discovery of novel molecules and materials.
Deep learning has been shown to learn efficient representations for structured data such as images, text, or audio.
In this work, we extend the SchNet architecture by using weighted skip connections to assemble the final representation.
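The idea of weighted skip connections can be sketched as follows. This is a simplified, framework-free illustration under my own assumptions (function and variable names are hypothetical, not SchNetPack code): the final representation is a learned weighted sum over the outputs of all interaction blocks, rather than only the output of the last block.

```python
# Hedged sketch of weighted skip connections: combine the feature
# vectors produced by each interaction block using learnable scalar
# weights. Names and shapes are illustrative only.

def weighted_skip_sum(block_outputs, weights):
    """Combine per-block feature vectors with scalar weights.

    block_outputs: list of T feature vectors (lists of floats),
                   one per interaction block
    weights:       list of T scalars (learned during training)
    """
    n_features = len(block_outputs[0])
    combined = [0.0] * n_features
    for w, feats in zip(weights, block_outputs):
        for i, f in enumerate(feats):
            combined[i] += w * f
    return combined

# toy example: three interaction-block outputs, two features each
outputs = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
weights = [0.5, 0.3, 0.2]
print(weighted_skip_sum(outputs, weights))  # approximately [0.7, 0.5]
```

In a real network the weights would be trainable parameters optimized jointly with the rest of the model; here they are fixed only to keep the sketch self-contained.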
1 code implementation • 13 Aug 2018 • Maximilian Alber, Sebastian Lapuschkin, Philipp Seegerer, Miriam Hägele, Kristof T. Schütt, Grégoire Montavon, Wojciech Samek, Klaus-Robert Müller, Sven Dähne, Pieter-Jan Kindermans
The presented library, iNNvestigate, addresses this by providing a common interface and out-of-the-box implementations of many analysis methods, including the reference implementations of PatternNet and PatternAttribution as well as LRP methods.
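A common interface over many analysis methods is often realized with a registry-plus-factory pattern. The sketch below is a generic illustration of that design, with entirely hypothetical names (`ANALYZERS`, `register`, `create_analyzer`, `GradientAnalyzer`); it is not iNNvestigate's actual API.

```python
# Hedged sketch of a "common interface" for analysis methods:
# each method registers itself under a string key, and users obtain
# analyzers through a single factory function.

ANALYZERS = {}

def register(name):
    """Class decorator that records an analyzer under a string key."""
    def wrap(cls):
        ANALYZERS[name] = cls
        return cls
    return wrap

@register("gradient")
class GradientAnalyzer:
    def analyze(self, model, inputs):
        # placeholder: a real analyzer would compute d(model)/d(input);
        # here we just apply the model element-wise for illustration
        return [model(x) for x in inputs]

def create_analyzer(name):
    """Factory: look up and instantiate the requested analysis method."""
    return ANALYZERS[name]()

analyzer = create_analyzer("gradient")
print(analyzer.analyze(lambda v: 2 * v, [1, 2, 3]))  # [2, 4, 6]
```

The benefit of this design is that new analysis methods can be added without changing user-facing code: registering a class under a new key suffices.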
With the rise of deep neural networks for quantum chemistry applications, there is a pressing need for architectures that, beyond delivering accurate predictions of chemical properties, are readily interpretable by researchers.
Deep learning has led to a paradigm shift in artificial intelligence, including web, text and image search, speech recognition, as well as bioinformatics, with growing impact in chemical physics.
Saliency methods aim to explain the predictions of deep neural networks.
Deep learning has the potential to revolutionize quantum chemistry as it is ideally suited to learn representations for structured data and speed up the exploration of chemical space.
We show that these methods do not produce the theoretically correct explanation for a linear model.
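The failure on a linear model can be illustrated with a small toy construction (my own, not code from the paper): for a linear model y = w · x, the gradient is constant and equal to w, so gradient-based saliency returns w even when w does not point along the direction that actually carries the signal.

```python
# Toy construction: the input x mixes a "signal" direction s (which
# determines y) with a "distractor" direction d. The weight vector w
# is chosen to cancel the distractor, so w differs from s.

s = [1.0, 0.0]   # signal direction: the part of x that determines y
d = [1.0, 1.0]   # distractor direction: added variation unrelated to y
w = [1.0, -1.0]  # satisfies w . s = 1 and w . d = 0

def predict(x):
    """Linear model y = w . x."""
    return sum(wi * xi for wi, xi in zip(w, x))

# input with y = 2 plus one unit of distractor: x = 2*s + 1*d = [3, 1]
x = [2.0 * si + 1.0 * di for si, di in zip(s, d)]
print(predict(x))  # 2.0 -- the distractor is cancelled exactly

# The gradient of a linear model is just w, so any gradient-based
# saliency map shows [1, -1] rather than the signal direction [1, 0].
gradient = list(w)
print(gradient)
```

The point of the construction is that the "theoretically correct" explanation should recover the signal direction s, while gradient-based methods report w, which here is orthogonal to part of the signal and nonzero along the distractor.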