2023
DOI: 10.1021/jacs.3c07513

Integrating Explainability into Graph Neural Network Models for the Prediction of X-ray Absorption Spectra

Amir Kotobi,
Kanishka Singh,
Daniel Höche
et al.

Abstract: The use of sophisticated machine learning (ML) models, such as graph neural networks (GNNs), to predict complex molecular properties or all kinds of spectra has grown rapidly. However, ensuring the interpretability of these models' predictions remains a challenge. For example, a rigorous understanding of the predicted X-ray absorption spectrum (XAS) generated by such ML models requires an in-depth investigation of the respective black-box ML model used. Here, this is done for different GNNs based on a comprehe…
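To make the abstract's setup concrete, the following is a minimal sketch (not the authors' implementation) of a GNN that maps a molecular graph to a discretized XAS spectrum: atom features pass through message-passing layers, are pooled into a graph-level embedding, and are regressed onto an energy grid. The layer type (GCNConv), feature dimensions, and 100-bin grid are illustrative assumptions.

# Minimal sketch (not the paper's code): a GNN mapping a molecular graph
# to a discretized X-ray absorption spectrum. Sizes are assumptions.
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv, global_mean_pool

class XASGNN(nn.Module):
    def __init__(self, num_node_features=16, hidden=64, spectrum_bins=100):
        super().__init__()
        self.conv1 = GCNConv(num_node_features, hidden)  # message passing, layer 1
        self.conv2 = GCNConv(hidden, hidden)             # message passing, layer 2
        self.readout = nn.Sequential(                    # graph-level regression head
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, spectrum_bins),
        )

    def forward(self, x, edge_index, batch):
        h = torch.relu(self.conv1(x, edge_index))
        h = torch.relu(self.conv2(h, edge_index))
        g = global_mean_pool(h, batch)                   # pool atom embeddings into one graph vector
        return self.readout(g)                           # predicted intensities on an energy grid

# Toy usage: a 3-atom "molecule" with random features and a simple bond graph.
x = torch.randn(3, 16)
edge_index = torch.tensor([[0, 1, 1, 2], [1, 0, 2, 1]])
batch = torch.zeros(3, dtype=torch.long)
spectrum = XASGNN()(x, edge_index, batch)                # shape: [1, 100]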

Cited by 6 publications (11 citation statements) | References 80 publications
“…An inverse problem of elucidating structure from spectra is another important topic where AI is starting to become the center of research. 161,162…”
Section: Spectroscopy
confidence: 99%
“…Benefiting from advances in data science, the machine learning (ML) approach has emerged as an unprecedented tool for automatically handling large data sets. For example, the combined use of attribute extraction and clustering algorithms allows the identification of unanticipated minority phases from over 10 million Co K-edge XANES spectra covering more than 100 LiCoO2 particles collected in the STXM experiment (Figure b). Beyond the battery community, the field of condensed matter and catalysis has witnessed the significant power of ML in X-ray spectroscopy. For instance, the direct conversion of XANES data into the radial distribution function (RDF) was accomplished using an artificial neural network (ANN) ML model (Figure c). More recently, the incorporation of explainability into the graph neural network (GNN) architecture not only facilitates accurate predictions of XANES spectra but also enhances our current comprehension of XANES (Figure d).…”
Section: Case Study Of Synchrotron-based X-ray Spectroscopy In Batter...
confidence: 99%
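The XANES-to-RDF conversion mentioned in this statement can be pictured as a straightforward regression problem. Below is a hedged sketch in which a plain feed-forward network maps a discretized XANES spectrum to a radial distribution function; it is not the cited ANN model, and the grid sizes and layer widths are assumptions.

# Sketch of the XANES -> RDF idea (generic MLP regressor, not the cited model).
import torch
import torch.nn as nn

class XanesToRDF(nn.Module):
    def __init__(self, n_energy=200, n_r=128, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_energy, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_r),              # RDF sampled on an assumed radial grid
        )

    def forward(self, xanes):
        return self.net(xanes)

model = XanesToRDF()
xanes = torch.randn(8, 200)                      # a batch of 8 normalized spectra (toy data)
rdf = model(xanes)                               # shape: [8, 128]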
“…Using this, the authors demonstrated that the resulting network could predict spectra with 90% accuracy, with the predicted spectral peak locations being within 1 eV of the expected energy, very comparable to the performance achieved by Rankine and Penfold [92], although this did not specifically take advantage of the message passing framework to encode higher-order information. A similar approach was recently adopted by Kotobi et al [126] in which the authors focused on developing an explainable network. Indeed, using feature attribution the authors were able to quantify the contribution of each atom to peaks in the spectrum, which subsequently could be compared to orbitals involved in the transitions.…”
Section: Molecular Graph Representations
confidence: 99%
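The per-atom attribution described in this statement can be illustrated with a generic gradient-times-input saliency: differentiate a single predicted peak bin with respect to the atom features and reduce the gradients to one score per atom. The sketch below applies this generic technique to a toy GNN; it is not the feature-attribution method or code used by Kotobi et al., and the model sizes, the chosen peak bin, and all names are assumptions.

# Generic gradient x input saliency on a toy GNN, attributing one spectral
# bin (a "peak") back to individual atoms. Illustrative only; not the
# attribution method of the cited work.
import torch
from torch_geometric.nn import GCNConv, global_mean_pool

conv = GCNConv(16, 32)
head = torch.nn.Linear(32, 100)                  # 100-bin spectrum (assumed grid)

x = torch.randn(4, 16, requires_grad=True)       # 4 atoms, 16 features each (toy input)
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3], [1, 0, 2, 1, 3, 2]])
batch = torch.zeros(4, dtype=torch.long)

h = torch.relu(conv(x, edge_index))
spectrum = head(global_mean_pool(h, batch))      # shape: [1, 100]

peak_bin = 42                                    # arbitrarily chosen bin standing in for a peak
spectrum[0, peak_bin].backward()                 # gradient of that peak w.r.t. atom features

atom_attribution = (x.grad * x).sum(dim=1).detach()  # one contribution score per atom
print(atom_attribution)                          # larger magnitude -> atom matters more for this peak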