2022
DOI: 10.21203/rs.3.rs-2042719/v1
Preprint
Examining graph neural networks for crystal structures: limitations and opportunities for capturing periodicity

Abstract: Historically, materials informatics has relied on human-designed descriptors of materials structures. In recent years, graph neural networks (GNNs) have been proposed for learning representations of crystal structures from data end-to-end, producing vectorial embeddings that are optimized for downstream prediction tasks. However, a systematic scheme for analyzing and understanding the limits of GNNs in capturing crystal structures is lacking. In this work, we propose to use human-designed descriptors as a bank of hu…

Cited by 7 publications (8 citation statements). References 42 publications.
“…Compared to BPNNs, MPNNs do not require the manual tuning of descriptors, as the representation of an atom's environment (or the receptive field) is directly learned from atom types and positions during the model training process. Moreover, MPNNs have the advantage of the message-passing scheme (see Section ) that distinguishes them from BPNNs.…”
Section: Discussion
confidence: 99%
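The statement above contrasts MPNNs with BPNNs: an atom's environment representation is learned from atom types and positions via message passing rather than hand-tuned descriptors. A minimal NumPy sketch of one message-passing update; the weight shapes, tanh nonlinearity, and sum aggregation are illustrative assumptions, not the specific architectures compared by the citing authors.

```python
import numpy as np

def message_passing_step(node_feats, edge_index, edge_feats, W_msg, W_upd):
    """One message-passing update for an atomic graph.

    node_feats : (n_atoms, d) learned per-atom features
    edge_index : list of (src, dst) neighbor pairs
    edge_feats : dict mapping (src, dst) -> (e,) edge feature,
                 e.g. an encoded interatomic distance
    W_msg      : (d + e, d) message weights
    W_upd      : (2 * d, d) update weights
    """
    agg = np.zeros_like(node_feats)
    for src, dst in edge_index:
        # Message from src to dst combines the sender's features
        # with the edge (distance) feature.
        msg = np.tanh(
            np.concatenate([node_feats[src], edge_feats[(src, dst)]]) @ W_msg
        )
        agg[dst] += msg  # sum-aggregate over the neighborhood
    # Update each atom with its aggregated messages.
    return np.tanh(np.concatenate([node_feats, agg], axis=1) @ W_upd)
```

Stacking several such steps grows each atom's receptive field one bond shell at a time, which is exactly why purely local updates can struggle with global, periodic structure.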
“…While these methods have become widespread for property prediction, graph convolution updates based only on the local neighborhood may limit the sharing of information related to long-range interactions or extensive properties. Gong et al demonstrated that these models can struggle to learn materials properties reliant on periodicity, including characteristics as simple as primitive cell lattice parameters (Figure 3c) [63]. As a result, while graph-based learning is a high-capacity approach, performance can vary substantially by the target use case.…”
Section: Learning On Periodic Crystal Graphs
confidence: 99%
“…Various strategies to account for this limitation have been proposed. Gong et al found that if the pooled representation after convolutions was concatenated with human-tuned descriptors, errors could be reduced by 90% for related predictions, including phonon internal energy and heat capacity [63]. Algorithms have attempted to more explicitly account for long-range interactions by modulating convolutions with a mask defined by a local basis of Gaussians and a periodic basis of plane waves [65], employing a unique global pooling scheme that could include additional context such as stoichiometry [66], or constructing additional features from the reciprocal representation of the crystal [67].…”
Section: Learning On Periodic Crystal Graphs
confidence: 99%
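The remedy quoted above — concatenating the pooled graph representation with human-tuned descriptors before the prediction head — can be sketched as follows. This is a minimal illustration of the concatenation idea only; the function names, a linear head, and the example descriptor choice (e.g. lattice parameters) are assumptions, not the exact pipeline of Gong et al.

```python
import numpy as np

def hybrid_readout(pooled_embedding, descriptors, W_head, b_head):
    """Predict a property from a GNN embedding augmented with descriptors.

    pooled_embedding : (p,) graph-level vector pooled over atom features
    descriptors      : (q,) human-designed features the GNN may miss,
                       e.g. lattice parameters encoding periodicity
    W_head           : (p + q,) weights of a linear prediction head
    b_head           : scalar bias
    """
    # Concatenation lets the head draw on both learned and
    # hand-crafted information about the crystal.
    z = np.concatenate([pooled_embedding, descriptors])
    return float(z @ W_head + b_head)
```

In practice the head would be trained jointly with the GNN; the key design choice is simply that global, periodicity-aware descriptors bypass the locality of graph convolutions.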