2022
DOI: 10.1007/s40192-021-00247-y

CrabNet for Explainable Deep Learning in Materials Science: Bridging the Gap Between Academia and Industry

Abstract: Despite recent breakthroughs in deep learning for materials informatics, there exists a disparity between their popularity in academic research and their limited adoption in the industry. A significant contributor to this “interpretability-adoption gap” is the prevalence of black-box models and the lack of built-in methods for model interpretation. While established methods for evaluating model performance exist, an intuitive understanding of the modeling and decision-making processes in models is nonetheless …

Cited by 5 publications (7 citation statements). References 80 publications.
“…Third, the use of an attention-based model, in combination with the Robust L2 loss, both leads to superior performance and provides unique advantages. The learned attention weights provide an opportunity to interpret the predictions made for a composition [126], and this could be a useful aspect of using the CraTENet model when analysing individual materials (rather than in bulk, as we have focused on here). Additionally, the Robust L2 loss is especially useful in that it allows the model to learn to quantify the uncertainty arising from mapping the composition (and optionally band gap) to thermoelectric properties.…”
Section: Discussion (mentioning; confidence: 99%)
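As background to the “Robust L2 loss” referenced in this statement: it is typically a heteroscedastic Gaussian negative log-likelihood in which the network predicts a per-sample log standard deviation alongside the property value, which is how the model “learns to quantify the uncertainty”. Below is a minimal PyTorch sketch assuming that standard form; the function name and exact constants are assumptions, not code from CraTENet or CrabNet.

```python
# Illustrative sketch only: a heteroscedastic ("robust L2"-style) regression loss.
# Assumed form: Gaussian negative log-likelihood with a learned per-sample log std.
import torch


def robust_l2_loss(pred: torch.Tensor, log_std: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    # Squared error is down-weighted where the model predicts high uncertainty,
    # while the +log_std term penalises claiming large uncertainty everywhere.
    return (0.5 * torch.exp(-2.0 * log_std) * (pred - target) ** 2 + log_std).mean()
```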
“…The symbolic regression model was built using the PySR library [48], while the ROOST and CrabNet models were implemented using their respective open resources [50–53]. The Matplotlib [55] and Mpltern [56] tools were employed to plot the ternary plots.…”
Section: Algorithms (mentioning; confidence: 99%)
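For context on the PySR usage mentioned above, a minimal symbolic-regression fit is sketched below; the operators, iteration count, and data are illustrative placeholders rather than the cited study's configuration.

```python
# Hedged sketch: minimal PySR symbolic regression run with placeholder data.
import numpy as np
from pysr import PySRRegressor

rng = np.random.default_rng(0)
X = rng.random((100, 3))              # placeholder descriptors
y = 2.0 * X[:, 0] + np.exp(X[:, 1])   # placeholder target with a known form

model = PySRRegressor(
    niterations=40,
    binary_operators=["+", "-", "*", "/"],
    unary_operators=["exp", "log"],
)
model.fit(X, y)          # evolves candidate symbolic expressions
print(model)             # shows the Pareto front of discovered equations
preds = model.predict(X) # predictions from the selected expression
```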
“…We employed XGBoost [45], LightGBM [46], ANN [47], symbolic regression [48,49], deep representation learning from stoichiometry (ROOST) [50,51] and compositionally restricted attention-based networks (CrabNet) [52,53] to develop predictive models for current densities in units of mA cm−2. Details can be found in the ESI.…”
Section: Algorithms (mentioning; confidence: 99%)
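As a concrete illustration of the gradient-boosted baselines listed above, the sketch below fits an XGBoost regressor to a placeholder composition-descriptor matrix; the features, hyperparameters, and target values are hypothetical and not taken from the cited work.

```python
# Hedged sketch: XGBoost regression on placeholder composition descriptors.
import numpy as np
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
X = rng.random((200, 16))   # stand-in composition-derived features
y = rng.random(200)         # stand-in current densities (mA cm^-2)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = XGBRegressor(n_estimators=500, learning_rate=0.05, max_depth=6)
model.fit(X_train, y_train)
print("Held-out R^2:", model.score(X_test, y_test))
```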
“…Common applications of GDL include shape analysis and pose recognition in computer vision [1], link and community detection on social media networks [2–4], representation learning on textual graphs [5,6], medical image analysis for disease detection [7–9] and property prediction for molecular and crystalline materials [10–18].…”
Section: Introduction (mentioning; confidence: 99%)