2022
DOI: 10.1021/acs.jpca.2c03901
Electronic Configurations of 3d Transition-Metal Compounds Using Local Structure and Neural Networks

Abstract: Machine learning (ML) methods extract statistical relationships between inputs and results. When the inputs are solid-state crystal structures, structure−property relationships can be obtained. In this work, we investigate whether a simple neural network is able to learn the 3d orbital occupations for the transition-metal (TM) centers in crystalline inorganic solid-state compounds using only the local structure around the transition-metal centers described by rotationally invariant fingerprints based on spheri…

Cited by 4 publications (3 citation statements). References 66 publications (90 reference statements).
“…To establish an ML vector, we chose 0, 1 (binary representation/binary notation) rather than atomic numbers as inputs to represent elements. In fact, this encoding method has been widely applied in feature construction for ML models in chemistry, 29,30 and can offer a high accuracy. 31 Besides, this method can offer more rationality for discrete features which are not comparable since using only 0 and 1 can minimize bias for different features, which is its most significant advantage.…”
Section: Feature Construction (mentioning)
confidence: 99%
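The 0/1 element encoding described in the statement above is the familiar one-hot scheme. A minimal sketch, assuming an illustrative element vocabulary (the actual feature vectors used by the citing work are not specified here):

```python
# Hypothetical sketch of the 0/1 (binary/one-hot) element encoding described
# in the citation statement. The element list below is an illustrative
# assumption, not the vocabulary used in the cited work.

ELEMENTS = ["Cr", "Mn", "Fe", "Co", "Ni"]  # illustrative 3d TM vocabulary

def one_hot_encode(element: str) -> list[int]:
    """Represent an element as a 0/1 vector instead of its atomic number."""
    if element not in ELEMENTS:
        raise ValueError(f"unknown element: {element}")
    return [1 if e == element else 0 for e in ELEMENTS]

print(one_hot_encode("Fe"))  # [0, 0, 1, 0, 0]
```

Because every component is 0 or 1, no element is numerically "larger" than another, which is the bias-minimizing property the quoted passage highlights for non-comparable discrete features.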
“…Other approaches to predict spin state ordering have used interpretable linear models 106 or neural networks trained using only the local structure around the TM centers. 107 While ML has been previously leveraged to predict SCO behavior through predictions of spin splitting energies or structural differences between spin states, in this study we use ML models to make predictions of transition temperatures of SCO complexes from the previously curated SCO-95 data set. 40 We train interpretable ML models, that is, random forests (RFs) on a full feature set as well as RF or kernel ridge regression (KRR) models on a feature set selected by RF-ranked recursive feature addition (RF-RFA) on a set of 76 SCO complexes for which experimental T1/2 values are available.…”
Section: Introduction (mentioning)
confidence: 99%
“…Besides predicting SCO behavior through predictions of spin splitting energies, a study from our group demonstrated the utility of ANNs trained on hybrid DFT structures by data-mining the literature for SCO complex structures and correctly assigning almost all spin states based on bond lengths in a set of 46 Fe(II) SCO complexes. Other approaches to predict spin state ordering have used interpretable linear models or neural networks trained using only the local structure around the TM centers …”
Section: Introduction (mentioning)
confidence: 99%