2018
DOI: 10.1021/acs.jctc.8b00832
Boosting Quantum Machine Learning Models with a Multilevel Combination Technique: Pople Diagrams Revisited

Abstract: Inspired by Pople diagrams popular in quantum chemistry, we introduce a hierarchical scheme, based on the multi-level combination (C) technique, to combine various levels of approximations made when calculating molecular energies within quantum chemistry. When combined with quantum machine learning (QML) models, the resulting CQML model is a generalized unified recursive kernel ridge regression which exploits correlations implicitly encoded in training data comprised of multiple levels in multiple dimensions. …

Cited by 105 publications (189 citation statements)
References 86 publications
“…[5][6][7] Further studies demonstrate the advantages of predicting the differences between low- and high-fidelity calculations (i.e., Δ-learning), [8] or of using multiple levels of fidelity to train the same model. [9,10] It is also possible to link easily computable properties to the outcomes of extensive calculations, as shown by how Seko et al. used calculated bulk moduli as inputs to a model that predicts melting temperature. [11] Neural networks offer more possibilities through the ability to train the same model on multiple properties (e.g., multi-task or transfer learning).…”
Section: Introduction
confidence: 99%
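The Δ-learning approach cited above (train a model on the difference between a cheap low-fidelity method and an expensive high-fidelity one, then predict as baseline plus learned correction) can be sketched with kernel ridge regression. Everything below is a hypothetical toy: the 1-D descriptor and the `low_fidelity`/`high_fidelity` functions are synthetic stand-ins for, e.g., a semi-empirical baseline and a coupled-cluster target; they are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
X_train = rng.uniform(-1, 1, size=(40, 1))  # toy 1-D molecular descriptor
X_test = rng.uniform(-1, 1, size=(10, 1))

def low_fidelity(x):
    # Cheap baseline method (synthetic stand-in).
    return np.sin(3 * x).ravel()

def high_fidelity(x):
    # Expensive target method: baseline plus a smooth correction.
    return low_fidelity(x) + 0.3 * x.ravel() ** 2

# Δ-learning target: the correction, not the total energy.
y_delta = high_fidelity(X_train) - low_fidelity(X_train)

def gaussian_kernel(A, B, sigma=0.5):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# Kernel ridge regression: solve (K + lam*I) alpha = y_delta.
lam = 1e-8
K = gaussian_kernel(X_train, X_train)
alpha = np.linalg.solve(K + lam * np.eye(len(X_train)), y_delta)

# Prediction = cheap baseline + learned Δ correction.
y_pred = low_fidelity(X_test) + gaussian_kernel(X_test, X_train) @ alpha
mae = np.abs(y_pred - high_fidelity(X_test)).mean()
```

Because the model only has to capture the (typically smoother, smaller-magnitude) difference between levels, far fewer high-fidelity training points are needed than for learning the high-fidelity values directly.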
“…the positions of all the atoms) of a material is known, ML may in principle provide the same information about the material as a DFT calculation would: structural stability, phonon dispersion relations, elastic constants, etc. It might even provide data of better quality than standard (semi-)local DFT calculations, comparable to more advanced DFT calculations with hybrid functionals or even higher-level methods, as recently demonstrated for molecules [20].…”
Section: Introduction
confidence: 84%
“…Computational chemistry is naturally a sub-field that has been increasingly boosted by the advances and unique capabilities of ML (Rupp et al., 2012; Ramakrishnan et al., 2014, 2015; Dral et al., 2015; Sánchez-Lengeling and Aspuru-Guzik, 2017; Christensen et al., 2019; Iype and Urolagin, 2019; Mezei and von Lilienfeld, 2019; Zaspel et al., 2019).…”
Section: Improving Computational and Quantum Chemistry
confidence: 99%