2023
DOI: 10.1146/annurev-matsci-080921-085947
Representations of Materials for Machine Learning

Abstract: High-throughput data generation methods and machine learning (ML) algorithms have given rise to a new era of computational materials science by learning the relations between composition, structure, and properties and by exploiting such relations for design. However, to build these connections, materials data must be translated into a numerical form, called a representation, that can be processed by an ML model. Data sets in materials science vary in format (ranging from images to spectra), size, and fidelity.…
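As a concrete illustration of the abstract's point that materials data must be translated into a numerical representation, here is a minimal sketch (not taken from the review) of one of the simplest schemes: encoding a chemical composition as a fixed-length fractional vector over a fixed element vocabulary. The element list and function name are illustrative assumptions.

```python
# Minimal sketch: encoding a chemical composition as a fixed-length
# fractional vector over a fixed element vocabulary, one of the simplest
# materials "representations" an ML model can consume.
ELEMENTS = ["H", "C", "N", "O", "Fe", "Ti"]  # toy vocabulary


def composition_vector(formula_counts):
    """Map {element: count} to a normalized fraction vector over ELEMENTS."""
    total = sum(formula_counts.values())
    return [formula_counts.get(el, 0) / total for el in ELEMENTS]


# Fe2O3 -> 40% Fe, 60% O by atom count
vec = composition_vector({"Fe": 2, "O": 3})  # [0.0, 0.0, 0.0, 0.6, 0.4, 0.0]
```

Composition vectors like this discard all structural information, which is why the structure-aware representations discussed in the review (and in the citing works below) are needed for most property-prediction tasks.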

Cited by 23 publications (6 citation statements)
References 167 publications
“…For efficient training, these crystal structures must be “represented” in a manner that maximizes the retention of information subject to common invariances and equivariances that should be exhibited by periodic crystals. 72 GPR is then used to fit a probabilistic relationship between the target properties (energies, forces) in the reference data and the descriptors that result from the chosen representation. 69 Once the model is trained, any new configuration (structure) of interest can be represented in the same manner and passed through the model to infer the energies and forces associated with that structure.…”
Section: Machine Learning Interatomic Potentials
confidence: 99%
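The GPR workflow quoted above can be sketched in a few lines: fit a probabilistic map from structure descriptors to energies, then pass a new structure's descriptors through the trained model to infer its energy with an uncertainty estimate. The descriptors below are random placeholders, not a real crystal representation, and the surrogate "energy" is a toy function.

```python
# Toy sketch of the GPR workflow described in the citation statement:
# fit descriptors -> energies, then predict for a new structure.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X_train = rng.uniform(size=(20, 3))   # 20 structures, 3-D placeholder descriptors
y_train = X_train.sum(axis=1)         # surrogate "energy" per structure

gpr = GaussianProcessRegressor(kernel=RBF(), alpha=1e-6).fit(X_train, y_train)

X_new = rng.uniform(size=(1, 3))      # descriptors of a new configuration
energy, std = gpr.predict(X_new, return_std=True)  # mean prediction + uncertainty
```

The `return_std=True` output is what makes GPR attractive for interatomic potentials: the model reports how uncertain it is for configurations far from the reference data, which can drive active learning.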
“…23–25 Machine learning based methods have been more recently applied, primarily to molecular design problems, but also to periodic crystal structures. 17,26 Moreover, there have been notable works using machine learning based methods to approximate the evaluation of material properties and behaviors. 8,27 This includes approximating DFT outputs directly for different systems, such as ground-state crystal structures for a variety of applications, such as catalysts.…”
Section: Related Work
confidence: 99%
“…Previous works have utilized generative adversarial networks (GANs), 12 diffusion models, 13,14 and reinforcement learning (RL), 15,16 in addition to advanced crystal representation schemes for generating crystals. 17,18 However, we identify two major gaps in the existing literature for AI-based material discovery. Firstly, most methods do not incorporate quantum mechanics-based first-principles calculations in the learning model, and instead use ML approximators.…”
Section: Introduction
confidence: 99%
“…Numerous ML techniques have been proposed to determine the relationship between composition, structure and properties in material design. The variety of current approaches includes descriptor-based representations, feature engineering, and deep learning techniques. Among these, graph neural networks (GNNs) and high-dimensional neural networks (HDNNs) emerge as the two most commonly used approaches for structure representation.…”
Section: Introduction
confidence: 99%
“…The variety of current approaches includes descriptor-based representations, feature engineering, and deep learning techniques. 10 Among these, graph neural networks (GNNs) 11 and high-dimensional neural networks (HDNNs) 12 emerge as the two most commonly used approaches for structure representation. We provide a brief comparison between these two methods.…”
Section: Introduction
confidence: 99%
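The graph representation that GNNs consume, mentioned in the last two citation statements, can be sketched simply: atoms become nodes and pairs of atoms within a distance cutoff become edges. This toy version omits periodic images for brevity; real crystal graphs must include neighbors in adjacent unit cells.

```python
# Hedged sketch of an atoms-to-graph conversion: nodes are atoms, edges
# connect pairs closer than a cutoff. Periodicity is omitted for brevity.
import numpy as np


def build_graph(positions, cutoff):
    """Return an edge list (i, j, distance) for atom pairs closer than cutoff."""
    edges = []
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            d = float(np.linalg.norm(positions[i] - positions[j]))
            if d < cutoff:
                edges.append((i, j, d))
    return edges


# Three atoms on a line, 1 Å apart; a 1.5 Å cutoff links only nearest neighbors.
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
edges = build_graph(pos, cutoff=1.5)  # [(0, 1, 1.0), (1, 2, 1.0)]
```

A GNN would then attach element features to each node and distance features to each edge, and learn properties by message passing over this graph; an HDNN instead builds a fixed-size descriptor of each atom's local environment and feeds it to a per-atom network.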