2023
DOI: 10.1038/s43588-023-00435-0
Current and future machine learning approaches for modeling atmospheric cluster formation

Cited by 6 publications (5 citation statements)
References 113 publications
“…In the previous section, we demonstrated that training on the binding energies of equilibrium configurations yields large errors when predicting binding energies for out-of-equilibrium structures. In Figure 6 , we show how an ML model can be tested on its transferability and extrapolation, 54 and we further examine these options in the following sections.…”
Section: Results (mentioning)
confidence: 99%
“…Since this QC method is computationally quite fast, using ML does not speed up the process. More useful is, for instance, Δ-ML ωB97X‑D∥GFN1‑xTB as used in our previous work on the CS of SA-multibase clusters or even Δ-ML DLPNO∥ωB97X‑D as suggested in our recent perspective . However, as a proof of concept, we will use r 2 SCAN-3c in the next section to show the potential ML speedup by the categorization trick.…”
Section: Application and Discussion (mentioning)
confidence: 99%
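The Δ-ML strategy quoted above trains a model on the small difference between a cheap method's energies and an expensive method's energies, then adds the learned correction back onto the cheap result. A minimal sketch of that idea follows; the single scalar feature, the quadratic "cheap" energy, and the linear correction are all synthetic illustrations, not the actual ωB97X-D/GFN1-xTB setup from the cited work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: one geometric feature x per "cluster" (all synthetic).
x = rng.uniform(-1.0, 1.0, 40)
E_low = x**2                      # stand-in for a cheap method (e.g. a tight-binding level)
E_high = x**2 + 0.1 * x + 0.05    # stand-in for an expensive reference method

# Delta-ML: fit only the correction E_high - E_low, not the total energy.
A = np.vander(x, 2)               # design matrix for a linear model [x, 1]
coef, *_ = np.linalg.lstsq(A, E_high - E_low, rcond=None)

# Final prediction: cheap energy plus the learned correction.
E_pred = E_low + A @ coef
print(np.max(np.abs(E_pred - E_high)))  # essentially zero: the delta here is exactly linear
```

The point of the construction is that the correction is smoother and smaller in magnitude than the total energy, so a simple model (here a deliberately trivial linear fit) suffices where a direct fit of E_high would need far more capacity and data.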
“…The recent explosion of ML utilization in quantum chemistry has shown that ML potentials can mimic potential energy surfaces within the chemical accuracy of QC methods. There are several ML techniques for regression tasks, such as artificial neural networks (NN), Gaussian process regression (GPR), and kernel ridge regression (KRR), each with their own strengths and weaknesses . The first task in creating an ML model is choosing the molecular representation for the studied system.…”
Section: Methods (mentioning)
confidence: 99%
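Of the regression techniques listed in the excerpt above, kernel ridge regression has the most compact closed form: solve (K + λI)α = y for the training kernel matrix K, then predict with kernel evaluations against the training set. The sketch below uses a Gaussian kernel and a toy 1D "energy" curve; the kernel choice, data, and hyperparameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise squared distances between rows of A and B, mapped through a Gaussian.
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * sigma**2))

def krr_fit(X, y, lam=1e-6, sigma=1.0):
    # Closed-form KRR solution: alpha = (K + lam*I)^-1 y
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(X_train, alpha, X_new, sigma=1.0):
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

# Toy example: learn a smooth 1D "potential energy" curve.
X = np.linspace(-1.0, 1.0, 50)[:, None]
y = np.sin(3.0 * X).ravel()
alpha = krr_fit(X, y, lam=1e-6, sigma=0.3)
y_hat = krr_predict(X, alpha, X, sigma=0.3)
```

The same closed form is what makes KRR costly at scale, as the quoted Introduction excerpt notes: the solve is O(N³) in the number of training structures, which is why kernel methods become demanding once accurate DFT-level training sets grow large.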
“…39,40 However, to mimic accurate DFT energies, kernel-based ML methods become computationally demanding 40–42 and neural-networks will require an extensive set of training data and hyperparameter optimization. 43 Moreover, ML methods often fail when predicting on structures different from the training set.…”
Section: Introduction (mentioning)
confidence: 99%