2021 | DOI: 10.1021/acs.jcim.0c01224

Transferable Multilevel Attention Neural Network for Accurate Prediction of Quantum Chemistry Properties via Multitask Learning

Abstract: The development of efficient machine-learning models for predicting specific properties is of great importance for innovation in chemistry and materials science. However, predicting global electronic-structure properties, such as the frontier molecular orbital energies (the highest occupied molecular orbital, HOMO, and the lowest unoccupied molecular orbital, LUMO) and the HOMO–LUMO gap, when extrapolating from small-molecule data to larger molecules remains a challenge. Here, we develop a multilevel attention neural …
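
The abstract describes multitask prediction of HOMO, LUMO, and the HOMO–LUMO gap from a shared molecular representation. Below is a minimal sketch of that multitask idea, assuming a generic shared encoder over fixed-size descriptor vectors with one regression head per property; the encoder, layer sizes, and names are illustrative, not the paper's attention-network implementation.

```python
import torch
import torch.nn as nn

class MultitaskHead(nn.Module):
    """Shared encoder with one regression head per property (HOMO, LUMO, gap).

    Illustrative only: the encoder is a plain MLP over a fixed-size molecular
    descriptor vector, not the paper's multilevel attention network.
    """
    def __init__(self, in_dim: int, hidden: int = 128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
        )
        # One scalar output head per target property.
        self.heads = nn.ModuleDict(
            {name: nn.Linear(hidden, 1) for name in ("homo", "lumo", "gap")}
        )

    def forward(self, x: torch.Tensor) -> dict:
        h = self.encoder(x)
        return {name: head(h).squeeze(-1) for name, head in self.heads.items()}

model = MultitaskHead(in_dim=64)
x = torch.randn(8, 64)                       # batch of 8 descriptor vectors
preds = model(x)
# Multitask loss: sum of per-property MSE terms against reference values.
targets = {k: torch.randn(8) for k in preds}
loss = sum(nn.functional.mse_loss(preds[k], targets[k]) for k in preds)
loss.backward()
```

The shared encoder lets gradients from all three targets shape one representation, which is the usual motivation for multitask learning on correlated electronic properties.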


Cited by 84 publications (59 citation statements) · References 95 publications (158 reference statements)

Citation statements, ordered by relevance:
“…In short, increasing the number of layers in the first dense block should theoretically improve model accuracy, but at an associated computational cost. To measure model sensitivity to the depth of the first dense block, DenseNet models with different d1 values (viz., 12, 16, 20, 24) were prepared, keeping the total depth (d1 + d2) of the network fixed at 32. The number of dense layers in the second dense block (d2) was varied accordingly to keep the overall depth constant across all the models.…”
Section: Effect of Varying the Dense Block Configuration
confidence: 99%
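
A minimal sketch of the configuration sweep this statement describes, assuming a generic two-block DenseNet constructor; `build_densenet` and its arguments are hypothetical stand-ins, and only the d1/d2 arithmetic comes from the quoted text.

```python
# Sweep the first dense block's depth (d1) while holding d1 + d2 = 32,
# as described in the quoted statement. `build_densenet` is a hypothetical
# constructor standing in for the cited work's model-building code.
TOTAL_DEPTH = 32

def build_densenet(d1: int, d2: int) -> dict:
    # Placeholder: return a config dict instead of an actual model.
    return {"block1_layers": d1, "block2_layers": d2}

configs = []
for d1 in (12, 16, 20, 24):
    d2 = TOTAL_DEPTH - d1          # second block absorbs the remaining depth
    configs.append(build_densenet(d1, d2))

for cfg in configs:
    print(cfg)  # {'block1_layers': 12, 'block2_layers': 20}, etc.
```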
“…However, the steep computational requirements of highly accurate methods such as CCSD(T) 19 and Gaussian-4 20 preclude their use on a routine basis. A variety of noteworthy graph-based architectures (viz., SchNet, 21 PhysNet, 22 DimeNet, 23 DeepMoleNet, 24 OrbNet 25 ) have been proposed for the prediction of DFT-level (B3LYP/6-31G(2df,p)) energies on the QM9 dataset. 26,27 In this work, however, we aim to predict G4(MP2)-level energies, a relatively cheap alternative to the G4 method that is typically accurate to within 1.0 kcal mol−1 of experiment and hence is a more valuable quantity to reproduce.…”
Section: Introduction
confidence: 99%
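
For context on the QM9 benchmark that recurs in these statements, here is a minimal sketch of loading QM9 and selecting the frontier-orbital targets with PyTorch Geometric; the target column indices (HOMO = 2, LUMO = 3, gap = 4) follow PyTorch Geometric's documented ordering and should be verified against the installed version.

```python
# Minimal QM9 loading sketch using PyTorch Geometric (not the cited works' code).
# In PyG's QM9, data.y holds 19 regression targets per molecule; columns
# 2, 3, and 4 are HOMO, LUMO, and the HOMO-LUMO gap (verify for your version).
from torch_geometric.datasets import QM9
from torch_geometric.loader import DataLoader

dataset = QM9(root="data/QM9")
HOMO, LUMO, GAP = 2, 3, 4

loader = DataLoader(dataset, batch_size=32, shuffle=True)
batch = next(iter(loader))
homo = batch.y[:, HOMO]   # frontier-orbital targets for this mini-batch
lumo = batch.y[:, LUMO]
gap = batch.y[:, GAP]
print(homo.shape, lumo.shape, gap.shape)
```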
“…These state-of-the-art ML methods have shown impressive performance on benchmark datasets such as the QM9 dataset, which contains the ground-state properties of molecules consisting of up to 9 non-hydrogen atoms. [42][43][44][45][46][47][48][49][50][51][52][53][54][55] Despite the remarkable progress of these graph ML methods, their application to conjugated long oligomers and polymers remains limited, primarily due to the difficulty of obtaining sufficient training data using quantum chemical methods.…”
Section: Introduction
confidence: 99%
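
The size cutoff mentioned here (at most 9 non-hydrogen atoms) is straightforward to check programmatically; a small RDKit sketch, purely illustrative:

```python
# Check the QM9-style size criterion (<= 9 non-hydrogen atoms) with RDKit.
# Illustrative only; QM9 itself derives from the GDB databases rather than
# from filtering arbitrary SMILES strings like this.
from rdkit import Chem

smiles = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]  # ethanol, benzene, aspirin
for smi in smiles:
    mol = Chem.MolFromSmiles(smi)
    n_heavy = mol.GetNumHeavyAtoms()
    print(smi, n_heavy, "within QM9 size range" if n_heavy <= 9 else "too large")
```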
“…Graph neural networks (GNNs) achieve state-of-the-art predictions of quantum mechanical properties, physicochemical properties, biological activity, and toxicity. [1][2][3][4][5][6][7][8][9][10][11] To fairly evaluate the quality of different methods, Wu et al. introduced MoleculeNet as a large-scale benchmark for molecular property prediction. 12 It provides multiple public data sets and data splittings, as well as high-quality implementations of popular molecular featurization and learning algorithms.…”
Section: Introduction
confidence: 99%
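
A minimal sketch of how the MoleculeNet benchmark mentioned here is commonly accessed through DeepChem's loaders; the choice of dataset (Tox21) and featurizer is illustrative, and the other MoleculeNet loaders follow the same pattern.

```python
# Load a MoleculeNet benchmark task via DeepChem (illustrative; loaders such
# as load_tox21 and load_qm9 share this interface).
import deepchem as dc

tasks, datasets, transformers = dc.molnet.load_tox21(
    featurizer="GraphConv",   # graph featurization for GNN-style models
    splitter="scaffold",      # scaffold split, MoleculeNet's recommended split
)
train, valid, test = datasets
print(tasks)                  # the 12 Tox21 toxicity prediction tasks
print(train.y.shape)          # (n_train_molecules, n_tasks) label matrix
```

The scaffold splitter groups molecules by Bemis-Murcko scaffold, so the test set contains chemotypes unseen during training, which is the benchmark's intended measure of generalization.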