2021
DOI: 10.1038/s41524-021-00609-2
Deep learning framework for material design space exploration using active transfer learning and data augmentation

Abstract: Neural network-based generative models have been actively investigated as an inverse design method for finding novel materials in a vast design space. However, the applicability of conventional generative models is limited because they cannot access data outside the range of training sets. Advanced generative models that were devised to overcome the limitation also suffer from the weak predictive power on the unseen domain. In this study, we propose a deep neural network-based forward design approach that enab…
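The abstract pairs active transfer learning with data augmentation for forward property prediction. As an illustration only, the sketch below shows a generic active-learning loop of that flavor: an ensemble surrogate is trained on a narrow initial set, the most uncertain candidates in a wider design space are labeled by a stand-in oracle() function, and the training data are augmented before retraining. The surrogate model, the acquisition rule, and oracle() are assumptions for this sketch, not the authors' implementation.

```python
# Generic active-learning / data-augmentation loop (illustrative sketch only).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def oracle(x):
    # Hypothetical ground-truth evaluator (e.g. a simulation or experiment).
    return np.sin(3 * x).ravel() + 0.5 * x.ravel()

# Initial (narrow) training set and a wider candidate design space.
X_train = rng.uniform(-1.0, 1.0, size=(40, 1))
y_train = oracle(X_train)
X_pool = rng.uniform(-3.0, 3.0, size=(500, 1))

for round_ in range(5):
    # Small ensemble of networks; disagreement serves as an uncertainty proxy.
    ensemble = [
        MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                     random_state=seed).fit(X_train, y_train)
        for seed in range(5)
    ]
    preds = np.stack([m.predict(X_pool) for m in ensemble])
    uncertainty = preds.std(axis=0)

    # Acquire the most uncertain candidates, label them with the oracle,
    # and augment the training data before the next round.
    pick = np.argsort(uncertainty)[-10:]
    X_new = X_pool[pick]
    X_train = np.vstack([X_train, X_new])
    y_train = np.concatenate([y_train, oracle(X_new)])
    X_pool = np.delete(X_pool, pick, axis=0)
    print(f"round {round_}: train size = {len(X_train)}, "
          f"max pool uncertainty = {uncertainty.max():.3f}")
```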

Cited by 101 publications (59 citation statements)
References 54 publications
“…However, this improvement seems to be specific to our test data, as predictions on the relaxed validation data were better when the model was trained only on relaxed structures (Figure S2). The CGCNN-HD trained on only relaxed structures underpredicted the structures in H-Relaxed, likely because the training data contain relatively few transition metal hydrides and the bounded hyperbolic activation function, utilized in the CGCNN-HD's convolutional layers, has poor predictive power on unseen domains [48]. This poor predictive power on unseen domains is also the reason the CGCNN-HD models make poor predictions on the high and low formation energy structures of the MP data.…”
Section: Results (mentioning)
confidence: 99%
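The statement above attributes the underprediction to a bounded hyperbolic (tanh) activation. As a rough, stand-alone illustration (not taken from the cited work), the snippet below shows how tanh saturates outside the fitted range, which is why a network that relies on it tends to flatten its predictions on unseen domains.

```python
# Minimal numeric illustration of tanh saturation on an unseen domain.
import numpy as np

x_seen = np.linspace(-1, 1, 5)      # region covered by training data
x_unseen = np.linspace(2, 4, 3)     # unseen domain

print("tanh inside training range:", np.round(np.tanh(x_seen), 3))
print("tanh on unseen domain     :", np.round(np.tanh(x_unseen), 3))
# On the unseen domain tanh is ~1.0 everywhere, so a regressor whose output is
# tanh-bounded predicts an almost constant value there, i.e. it underpredicts
# targets that continue to grow.
```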
“…• The toolkit we developed provides a faster and more flexible structure–property platform, which expedites particle-based simulation and materials design procedures. • CuLSM opens the avenue for high-throughput and high-fidelity data generation to meet the increasing need for machine learning-aided materials design protocols (Kim et al., 2021; Sui et al., 2021). • The work largely reduces the computational cost of predicting elasticity and fracture behaviors of complex materials systems, further accelerating the design phase by offering predictive insights for additive manufacturing and mechanical experimentation.…”
Section: Discussion (mentioning)
confidence: 99%
“…In addition, the ML model must consider an adequate design space between the training set and the test set. For example, if a variable with a higher concentration range than the training set used for model development is input into the developed ML model, the calibration performance may deteriorate [15]. In other words, the training set must contain a wide enough range of PM2.5 concentrations.…”
Section: Dataset for Calibration Machine Learning (ML) Modeling (mentioning)
confidence: 99%
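This statement describes keeping test inputs within the design space spanned by the calibration training set. A minimal sketch of such a range check follows; the variable names and example values (raw sensor PM2.5 and relative humidity) are hypothetical.

```python
# Flag test samples whose features fall outside the training-set range,
# i.e. inputs on which a calibration model would be extrapolating.
import numpy as np

def out_of_range_mask(X_train, X_test):
    """Boolean mask of test rows with any feature outside the training range."""
    lo, hi = X_train.min(axis=0), X_train.max(axis=0)
    return ((X_test < lo) | (X_test > hi)).any(axis=1)

# Illustrative data: column 0 = raw sensor PM2.5 (ug/m3), column 1 = relative humidity (%).
X_train = np.array([[10.0, 40.0], [35.0, 55.0], [60.0, 70.0]])
X_test = np.array([[25.0, 50.0], [120.0, 65.0]])  # 120 ug/m3 exceeds the training maximum

print(out_of_range_mask(X_train, X_test))  # [False  True]
```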