1 code implementation • 27 Mar 2023 • Lixue Cheng, Yu-Qin Chen, Shi-Xin Zhang, Shengyu Zhang
Quantum approximate optimization algorithm (QAOA), one of the most representative quantum-classical hybrid algorithms, is designed to solve combinatorial optimization problems by recasting the discrete problem as a classical optimization over continuous circuit parameters.
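The hybrid loop described above can be sketched on a toy instance: a p=1 QAOA circuit for MaxCut on a single edge (two qubits), simulated as a statevector, with the continuous parameters optimized by a classical grid search. All specifics here (problem instance, depth, optimizer) are illustrative and not from the paper.

```python
import numpy as np

def qaoa_expectation(gamma, beta):
    # Cost Hamiltonian C = (1 - Z0 Z1) / 2 is diagonal in the Z basis.
    zz = np.array([1, -1, -1, 1])
    cost = (1 - zz) / 2
    psi = np.full(4, 0.5, dtype=complex)           # |++> initial state
    psi = np.exp(-1j * gamma * cost) * psi         # cost-phase layer
    rx = np.array([[np.cos(beta), -1j * np.sin(beta)],
                   [-1j * np.sin(beta), np.cos(beta)]])  # exp(-i beta X)
    psi = np.kron(rx, rx) @ psi                    # mixer layer on both qubits
    return np.real(np.vdot(psi, cost * psi))       # <C> = expected cut value

# Classical outer loop: optimize the continuous circuit parameters.
gammas = np.linspace(0, np.pi, 101)
betas = np.linspace(0, np.pi / 2, 101)
best = max(qaoa_expectation(g, b) for g in gammas for b in betas)
```

For this single-edge instance the p=1 ansatz can reach the exact optimum (one cut edge), so `best` approaches 1; realistic QAOA uses gradient-based optimizers rather than grid search.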
1 code implementation • 17 Jul 2022 • Lixue Cheng, Jiace Sun, J. Emiliano Deustua, Vignesh C. Bhethanabotla, Thomas F. Miller III
We introduce kernel addition Gaussian process regression (KA-GPR), a novel machine learning strategy within molecular-orbital-based machine learning (MOB-ML), to learn the total correlation energies of general electronic structure theories for closed- and open-shell systems.
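The kernel-addition idea can be sketched in ordinary Gaussian process regression: the covariance is a sum of two kernels, and the standard closed-form GP posterior mean is used for prediction. The 1-D data and kernel choices below are hypothetical toys, not the molecular-orbital features or kernels of the paper.

```python
import numpy as np

def rbf(x1, x2, length=1.0):
    return np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / length ** 2)

def linear(x1, x2):
    return x1[:, None] * x2[None, :]

def added_kernel(x1, x2):
    return rbf(x1, x2) + linear(x1, x2)   # kernel addition: sum of two kernels

x_train = np.linspace(-2, 2, 20)
y_train = 0.5 * x_train + np.sin(2 * x_train)   # toy target with two "components"

noise = 1e-6
K = added_kernel(x_train, x_train) + noise * np.eye(len(x_train))
alpha = np.linalg.solve(K, y_train)             # GP weights

x_test = np.array([0.5])
y_pred = added_kernel(x_test, x_train) @ alpha  # GP posterior mean
```

A sum of valid kernels is itself a valid kernel, which is what makes this combination well defined.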
2 code implementations • 31 May 2022 • Jiace Sun, Lixue Cheng, Thomas F. Miller III
To demonstrate the ability of MOB-ML to function as generalized density-matrix functionals for molecular dipole moments and energies of organic molecules, we further apply the proposed MOB-ML approach to train and test the molecules from the QM9 dataset.
2 code implementations • 19 May 2022 • Lixue Cheng, ZiYi Yang, ChangYu Hsieh, Benben Liao, Shengyu Zhang
Directed evolution is a versatile protein-engineering technique that mimics natural selection by iteratively alternating between mutagenesis and screening to search for sequences that optimize a given property of interest, such as catalytic activity or binding affinity to a specified target.
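The mutagenesis-screening loop can be sketched with a toy in-silico analogue: the "fitness" below is a match count against a hypothetical target sequence standing in for a real assay, and all sequences and parameters are illustrative.

```python
import random

ALPHABET = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids
TARGET = "MKVLQT"                   # hypothetical optimum, for illustration only

def fitness(seq):
    # Toy screening objective: number of positions matching the target.
    return sum(a == b for a, b in zip(seq, TARGET))

def mutate(seq, rng):
    # Mutagenesis: random single-point substitution.
    i = rng.randrange(len(seq))
    return seq[:i] + rng.choice(ALPHABET) + seq[i + 1:]

rng = random.Random(42)
parent = "AAAAAA"
for generation in range(500):
    # Mutagenesis: generate a variant library from the current parent.
    library = [mutate(parent, rng) for _ in range(20)]
    # Screening: keep the fittest variant (greedy selection).
    parent = max(library + [parent], key=fitness)
```

Real campaigns replace the greedy selection with experimental screening, and ML-guided variants of this loop propose the library from a learned sequence-property model.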
no code implementations • 21 Apr 2022 • Lixue Cheng, Jiace Sun, Thomas F. Miller III
The resulting clusters from supervised or unsupervised clustering are further combined with scalable Gaussian process regression (GPR) or linear regression (LR) to learn molecular energies accurately, with a local regression model generated in each cluster.
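The cluster-then-regress idea can be sketched on toy data: unsupervised k-means-style clustering of the inputs, followed by an independent linear regression within each cluster. The piecewise-linear 1-D data are hypothetical, not the molecular-orbital features or energy labels of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 200)
y = np.where(x < 0, -2 * x, 3 * x)      # piecewise-linear toy target

# Simple 1-D k-means with k=2 (unsupervised clustering step)
centers = np.array([-0.5, 0.5])
for _ in range(20):
    labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
    centers = np.array([x[labels == k].mean() for k in range(2)])

# Local linear regression (slope + intercept) fit per cluster
models = []
for k in range(2):
    mask = labels == k
    X = np.column_stack([x[mask], np.ones(mask.sum())])
    coef, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
    models.append(coef)

def predict(xq):
    # Route a query to its nearest cluster's local model.
    k = int(np.argmin(np.abs(xq - centers)))
    return models[k][0] * xq + models[k][1]
```

Each local model only has to capture behavior within its cluster, which is what lets simple regressors remain accurate on a globally non-linear target.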
2 code implementations • NeurIPS Workshop AI4Science 2021 • Jiace Sun, Lixue Cheng, Thomas F. Miller III
The training of MOB-ML was previously limited to 220 molecules; BBMM and AltBBMM scale it up by over 30 times to 6,500 molecules (more than a million pair energies).
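The core trick behind matrix-multiplication-based GP training can be sketched as follows: solve the GP linear system (K + σ²I)α = y with conjugate gradients, touching the kernel matrix only through matrix-vector products, instead of a cubic-cost Cholesky factorization. The RBF kernel and random 1-D data are a toy stand-in, not the BBMM/AltBBMM implementation of the paper.

```python
import numpy as np

def conjugate_gradients(matvec, b, tol=1e-8, max_iter=500):
    # Standard CG for a symmetric positive-definite system A x = b,
    # where A is accessed only via the matvec closure.
    x = np.zeros_like(b)
    r = b - matvec(x)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        step = rs / (p @ Ap)
        x += step * p
        r -= step * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100)
y = np.sin(3 * x)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.3 ** 2)
sigma2 = 1e-2
matvec = lambda v: K @ v + sigma2 * v   # never factorize K, only multiply by it
alpha_vec = conjugate_gradients(matvec, y)
```

Because only matvecs are needed, the kernel matrix can be applied in blocks on accelerators, which is what makes this formulation scale to training sets far beyond the Cholesky regime.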
no code implementations • 4 Sep 2019 • Lixue Cheng, Nikola B. Kovachki, Matthew Welborn, Thomas F. Miller III
Machine learning (ML) in the representation of molecular-orbital-based (MOB) features has been shown to be an accurate and transferable approach to the prediction of post-Hartree-Fock correlation energies.
no code implementations • 10 Jan 2019 • Lixue Cheng, Matthew Welborn, Anders S. Christensen, Thomas F. Miller III
Finally, a transferability test in which models trained for seven-heavy-atom systems are used to predict energies for thirteen-heavy-atom systems reveals that MOB-ML reaches chemical accuracy with 36-fold fewer training calculations than $\Delta$-ML (140 versus 5000 training calculations).
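The Δ-ML baseline referenced above can be sketched in miniature: rather than learning the expensive quantity directly, the model learns the correction from a cheap low-level result to the high-level reference. The 1-D "energy" functions and polynomial fit below are hypothetical stand-ins, not the quantum-chemistry methods or models of the paper.

```python
import numpy as np

def e_low(x):            # stand-in for a cheap baseline method
    return np.sin(x)

def e_high(x):           # stand-in for an expensive reference method
    return np.sin(x) + 0.1 * x ** 2

x_train = np.linspace(-2, 2, 15)
delta = e_high(x_train) - e_low(x_train)   # the small, smooth correction

# Fit the correction with a simple polynomial least-squares model.
coef = np.polyfit(x_train, delta, deg=3)

def predict_high(x):
    # Delta-ML prediction: cheap baseline plus learned correction.
    return e_low(x) + np.polyval(coef, x)
```

The correction is typically smoother and smaller in magnitude than the target itself, which is why Δ-style models can need fewer training calculations than learning the high-level energy from scratch.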