Search Results for author: Thomas F. Miller III

Found 10 papers, 4 papers with code

Molecular-orbital-based Machine Learning for Open-shell and Multi-reference Systems with Kernel Addition Gaussian Process Regression

1 code implementation • 17 Jul 2022 • Lixue Cheng, Jiace Sun, J. Emiliano Deustua, Vignesh C. Bhethanabotla, Thomas F. Miller III

We introduce kernel addition Gaussian process regression (KA-GPR), a machine learning strategy within molecular-orbital-based machine learning (MOB-ML), to learn the total correlation energies of general electronic structure theories for closed- and open-shell systems.
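
The kernel-addition idea can be illustrated with a short, self-contained sketch: a Gaussian process whose covariance is a sum of simpler kernels, fit to synthetic stand-ins for MOB-ML pair-energy data (the actual MOB-ML features and kernel choices differ).

```python
# Minimal additive-kernel GP sketch; features and targets below are synthetic
# placeholders, not real MOB-ML pair energies.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern, WhiteKernel

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                              # placeholder feature vectors
y = X @ rng.normal(size=8) + 0.01 * rng.normal(size=200)   # surrogate pair energies

# "Kernel addition": the covariance is a sum of simpler kernels.
kernel = Matern(length_scale=1.0, nu=2.5) + RBF(length_scale=1.0) + WhiteKernel(1e-4)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

mean, std = gpr.predict(X[:5], return_std=True)
print(mean, std)
```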

GPR

Molecular Dipole Moment Learning via Rotationally Equivariant Gaussian Process Regression with Derivatives in Molecular-orbital-based Machine Learning

2 code implementations • 31 May 2022 • Jiace Sun, Lixue Cheng, Thomas F. Miller III

To demonstrate the ability of MOB-ML to function as generalized density-matrix functionals for molecular dipole moments and energies of organic molecules, we further apply the proposed MOB-ML approach to train and test on molecules from the QM9 dataset.
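
For orientation only, the sketch below fits the three dipole components with a generic multi-output GP on placeholder features; it does not reproduce the paper's rotationally equivariant kernel or its use of derivative information.

```python
# Non-equivariant baseline sketch: multi-output GP regression of dipole
# components on synthetic placeholder features (not the paper's method).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 10))             # placeholder molecular features
mu = X @ rng.normal(size=(10, 3))          # surrogate dipole vectors (x, y, z)

gpr = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, mu)
print(gpr.predict(X[:3]))                  # predicted dipole vectors
```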

GPR

Accurate Molecular-Orbital-Based Machine Learning Energies via Unsupervised Clustering of Chemical Space

no code implementations • 21 Apr 2022 • Lixue Cheng, Jiace Sun, Thomas F. Miller III

The resulting clusters from supervised or unsupervised clustering are further combined with scalable Gaussian process regression (GPR) or linear regression (LR) to learn molecular energies accurately by generating a local regression model in each cluster.
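
A minimal sketch of the cluster-then-regress pattern, using k-means and a per-cluster linear model on synthetic data (the paper's supervised clustering and scalable GPR variants are not reproduced here):

```python
# Cluster the feature space, then fit one local regression model per cluster.
# Data is synthetic; features and labels are stand-ins for MOB-ML quantities.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 12))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.01 * rng.normal(size=1000)

kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)
models = {k: LinearRegression().fit(X[kmeans.labels_ == k], y[kmeans.labels_ == k])
          for k in range(5)}

# Predict by routing each new point to its cluster's local model.
X_new = rng.normal(size=(4, 12))
assign = kmeans.predict(X_new)
y_new = np.array([models[k].predict(x[None, :])[0] for k, x in zip(assign, X_new)])
print(y_new)
```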

Clustering

Molecular Energy Learning Using Alternative Blackbox Matrix-Matrix Multiplication Algorithm for Exact Gaussian Process

2 code implementations • NeurIPS Workshop AI4Science 2021 • Jiace Sun, Lixue Cheng, Thomas F. Miller III

The training of MOB-ML was previously limited to 220 molecules; BBMM and AltBBMM scale it up by more than 30 times, to 6500 molecules (over a million pair energies).
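
GPyTorch's exact-GP inference is built on blackbox matrix-matrix multiplication (BBMM), so a minimal GPyTorch training loop illustrates the kind of computation being scaled; the AltBBMM variant from the paper is not part of stock GPyTorch, and the data below is synthetic.

```python
# Exact GP trained with GPyTorch, whose inference engine uses BBMM under the
# hood. Synthetic data; not the MOB-ML pipeline or the AltBBMM algorithm.
import torch
import gpytorch

train_x = torch.randn(500, 8)                     # placeholder feature vectors
train_y = torch.sin(train_x[:, 0]) + 0.05 * torch.randn(500)

class ExactGPModel(gpytorch.models.ExactGP):
    def __init__(self, x, y, likelihood):
        super().__init__(x, y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.MaternKernel(nu=2.5))

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x))

likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = ExactGPModel(train_x, train_y, likelihood)
mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)

model.train(); likelihood.train()
for _ in range(50):
    optimizer.zero_grad()
    loss = -mll(model(train_x), train_y)  # negative log marginal likelihood
    loss.backward()
    optimizer.step()
```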

Informing Geometric Deep Learning with Electronic Interactions to Accelerate Quantum Chemistry

no code implementations • 31 May 2021 • Zhuoran Qiao, Anders S. Christensen, Matthew Welborn, Frederick R. Manby, Anima Anandkumar, Thomas F. Miller III

Predicting electronic energies, densities, and related chemical properties can facilitate the discovery of novel catalysts, medicines, and battery materials.

Multi-task learning for electronic structure to predict and explore molecular potential energy surfaces

no code implementations • 5 Nov 2020 • Zhuoran Qiao, Feizhi Ding, Matthew Welborn, Peter J. Bygrave, Daniel G. A. Smith, Animashree Anandkumar, Frederick R. Manby, Thomas F. Miller III

We refine the OrbNet model to accurately predict energy, forces, and other response properties for molecules using a graph neural-network architecture based on features from low-cost approximated quantum operators in the symmetry-adapted atomic orbital basis.
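
A common pattern behind joint energy-and-force prediction is to learn a scalar energy and obtain forces as the negative gradient with respect to atomic coordinates via automatic differentiation. The toy PyTorch sketch below shows only that pattern; it is not the OrbNet architecture or its quantum-operator features.

```python
# Toy multi-task pattern: predict a molecular energy from per-atom features,
# then obtain forces as -dE/dR with autograd. Not the OrbNet model.
import torch
import torch.nn as nn

class ToyEnergyModel(nn.Module):
    def __init__(self, n_feat=16):
        super().__init__()
        self.atom_net = nn.Sequential(nn.Linear(n_feat + 3, 64), nn.SiLU(), nn.Linear(64, 1))

    def forward(self, feats, coords):
        # Sum per-atom contributions into a total energy (simplistic readout).
        return self.atom_net(torch.cat([feats, coords], dim=-1)).sum()

model = ToyEnergyModel()
feats = torch.randn(12, 16)                       # placeholder per-atom features
coords = torch.randn(12, 3, requires_grad=True)   # atomic positions

energy = model(feats, coords)
forces = -torch.autograd.grad(energy, coords)[0]  # response property via autograd
print(energy.item(), forces.shape)
```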

Multi-Task Learning

OrbNet: Deep Learning for Quantum Chemistry Using Symmetry-Adapted Atomic-Orbital Features

no code implementations • 15 Jul 2020 • Zhuoran Qiao, Matthew Welborn, Animashree Anandkumar, Frederick R. Manby, Thomas F. Miller III

We introduce a machine learning method in which energy solutions from the Schrödinger equation are predicted using symmetry-adapted atomic-orbital features and a graph neural-network architecture.
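
A minimal graph-style readout (aggregate neighbor features, then sum atomwise contributions into a total energy) conveys the general shape of such a model; the inputs below are random placeholders, not symmetry-adapted atomic-orbital features, and the architecture is not OrbNet.

```python
# Tiny message-passing energy readout over a molecular graph. Placeholder
# node features and adjacency; illustrative only.
import torch
import torch.nn as nn

class TinyGraphEnergy(nn.Module):
    def __init__(self, n_feat=16, hidden=32):
        super().__init__()
        self.message = nn.Linear(n_feat, hidden)
        self.update = nn.Sequential(nn.Linear(n_feat + hidden, hidden), nn.SiLU())
        self.readout = nn.Linear(hidden, 1)

    def forward(self, node_feats, adjacency):
        msgs = adjacency @ self.message(node_feats)              # aggregate neighbors
        h = self.update(torch.cat([node_feats, msgs], dim=-1))   # update node states
        return self.readout(h).sum()                             # total energy

n_atoms = 10
feats = torch.randn(n_atoms, 16)
adj = (torch.rand(n_atoms, n_atoms) < 0.3).float()
adj = ((adj + adj.T) > 0).float()                                # symmetric adjacency
print(TinyGraphEnergy()(feats, adj).item())
```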

Regression-clustering for Improved Accuracy and Training Cost with Molecular-Orbital-Based Machine Learning

no code implementations • 4 Sep 2019 • Lixue Cheng, Nikola B. Kovachki, Matthew Welborn, Thomas F. Miller III

Machine learning (ML) in the representation of molecular-orbital-based (MOB) features has been shown to be an accurate and transferable approach to the prediction of post-Hartree-Fock correlation energies.
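
Regression clustering, in its generic form, alternates between assigning each sample to the regressor that fits it best and refitting each regressor on its assigned samples; the sketch below illustrates that loop on synthetic data and is not the exact MOB-ML procedure.

```python
# Generic regression-clustering loop on synthetic data (illustrative only).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
X = rng.normal(size=(600, 5))
true_labels = rng.integers(0, 3, size=600)
W = rng.normal(size=(3, 5))
y = np.einsum("ij,ij->i", X, W[true_labels]) + 0.01 * rng.normal(size=600)

n_clusters = 3
labels = rng.integers(0, n_clusters, size=len(X))        # random initialization
for _ in range(20):
    models = []
    for k in range(n_clusters):
        mask = labels == k
        if mask.sum() < 2:                                # guard against empty clusters
            mask = rng.random(len(X)) < 0.1
        models.append(LinearRegression().fit(X[mask], y[mask]))
    residuals = np.stack([(y - m.predict(X)) ** 2 for m in models], axis=1)
    labels = residuals.argmin(axis=1)                     # reassign to best-fitting model
```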

Clustering

A Universal Density Matrix Functional from Molecular Orbital-Based Machine Learning: Transferability across Organic Molecules

no code implementations • 10 Jan 2019 • Lixue Cheng, Matthew Welborn, Anders S. Christensen, Thomas F. Miller III

Finally, a transferability test in which models trained for seven-heavy-atom systems are used to predict energies for thirteen-heavy-atom systems reveals that MOB-ML reaches chemical accuracy with 36-fold fewer training calculations than $\Delta$-ML (140 versus 5000 training calculations).
