2020
DOI: 10.26434/chemrxiv.11869026.v1
Preprint
Compositionally-Restricted Attention-Based Network for Materials Property Prediction

Abstract: In this paper, we evaluate an attention-based neural network architecture for the prediction of inorganic materials properties given access to nothing but each material's chemical composition. We demonstrate that this novel application of self-attention for material property predictions strikingly outperforms both statistical and ensemble machine learning methods, as well as a fully-connected neural network. This Compositionally-Restricted Attention-Based network, referred to as CrabNet, is associate…
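The abstract describes applying self-attention to the set of elements in a composition. As a minimal, illustrative sketch (not the CrabNet implementation itself; all weights and dimensions below are made up), single-head scaled dot-product self-attention over per-element feature vectors looks like this:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # X: (n_elements, d_model) -- one row per element in the composition.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # pairwise element-element interactions
    return softmax(scores, axis=-1) @ V      # updated element representations

rng = np.random.default_rng(0)
d = 8
X = rng.normal(size=(3, d))                  # e.g. three element tokens (illustrative)
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)          # shape (3, d)
```

Each output row is a mixture of all element representations, weighted by learned pairwise relevance; this is what lets the network model interactions between elements rather than treating the composition as a flat feature vector.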


Cited by 8 publications (10 citation statements)
References 12 publications (17 reference statements)
“…A pretrained network was selected that provides accurate predictions across many different properties. For this implementation, the CrabNet architecture from Wang et al. [16] was chosen. This model provides state-of-the-art performance in predicting from elemental information and includes self-attention mechanisms.…”
Section: Methods (mentioning)
confidence: 99%
“…The element vectors are featurized according to each element's properties using traditional mat2vec, oliynyk, jarvis, or magpie embeddings. The fractions undergo an alternating sine and cosine expansion as described by Vaswani et al. [19] and implemented by Wang et al. [16] in CrabNet. It is important to note that log operations are applied to half of the fractional encodings to better preserve the influence of dopants on the predicted properties.…”
Section: Methods (mentioning)
confidence: 99%
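The excerpt above describes encoding element fractions with an alternating sine/cosine expansion, applying a log transform to half of the features so trace dopant amounts stay distinguishable. A hedged sketch of that idea (the frequency ladder and log scaling constants here are illustrative assumptions, not CrabNet's exact values):

```python
import numpy as np

def sinusoid(x, d):
    """Alternating sin/cos expansion of values x in [0, 1] (cf. Vaswani et al.)."""
    i = np.arange(d // 2)
    freq = 1.0 / (100.0 ** (2 * i / d))   # geometric frequency ladder (illustrative)
    ang = np.pi * x[:, None] * freq[None, :]
    out = np.empty((len(x), d))
    out[:, 0::2] = np.sin(ang)            # even feature slots: sine
    out[:, 1::2] = np.cos(ang)            # odd feature slots: cosine
    return out

def fractional_encoding(fracs, d_model=16):
    """Encode element fractions; half the features use log-scaled fractions
    so dopant-level amounts (e.g. 0.001 vs 0.002) remain distinguishable."""
    fracs = np.asarray(fracs, dtype=float)
    linear_half = sinusoid(fracs, d_model // 2)
    log_fracs = np.clip(-np.log10(fracs) / 4.0, 0.0, 1.0)  # illustrative scaling
    log_half = sinusoid(log_fracs, d_model // 2)
    return np.concatenate([linear_half, log_half], axis=1)

enc = fractional_encoding([0.98, 0.02])   # a host element and a 2% dopant
```

On a linear scale, 0.001 and 0.002 are nearly identical; on a log scale they differ by a full factor of two, which is why the log half of the encoding helps the model see dopants.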
“…In addition to learning element representations for a general purpose in materials science, as in the case of mat2vec, DL methods can also learn to relate element characteristics on a material-property-specific basis. For example, element embeddings were extracted from the CrabNet and HotCrab models, which were reproduced using the supplied model weights and the source code [57,58]. The CrabNet and HotCrab models use mat2vec and one-hot-encoded element features as the starting element representations, respectively.…”
Section: Learning Meaningful and Per-property Element Representations (mentioning)
confidence: 99%
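Extracting learned element embeddings, as described above, amounts to reading rows out of a trained model's embedding matrix and comparing them. A sketch under stated assumptions: the matrix below is a random stand-in (in practice it would come from the published checkpoint weights), and `element_vector` is a hypothetical helper, not an API of the CrabNet code:

```python
import numpy as np

# Random stand-in for a trained model's element-embedding matrix;
# one learned vector per element, indexed by atomic number.
rng = np.random.default_rng(1)
embedding_matrix = rng.normal(size=(103, 64))  # 103 elements x 64 dims (illustrative)

def element_vector(Z):
    """Learned representation for atomic number Z (1-indexed)."""
    return embedding_matrix[Z - 1]

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# With trained weights, chemically related elements (e.g. O, Z=8,
# and S, Z=16) would tend to have more similar learned vectors.
sim = cosine_similarity(element_vector(8), element_vector(16))
```

Comparing such vectors across models trained on different properties is what lets one ask whether the network has learned property-specific chemical relationships.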