2024
DOI: 10.1109/jbhi.2024.3390606

Understanding the Role of Self-Attention in a Transformer Model for the Discrimination of SCD From MCI Using Resting-State EEG

Elena Sibilano,
Domenico Buongiorno,
Michael Lassi
et al.

Abstract: The identification of EEG biomarkers to discriminate Subjective Cognitive Decline (SCD) from Mild Cognitive Impairment (MCI) conditions is a complex task which requires great clinical effort and expertise. We exploit the self-attention component of the Transformer architecture to obtain physiological explanations of the model's decisions in the discrimination of 56 SCD and 45 MCI patients using resting-state EEG. Specifically, an interpretability workflow leveraging attention scores and time-frequency analysis…
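The abstract describes scoring EEG segments with the Transformer's self-attention weights and then inspecting the most attended segments through time-frequency analysis. The snippet below is a minimal, hypothetical sketch of that general idea, not the authors' pipeline: it assumes an epoch split into fixed-length windows, an illustrative linear token embedding, and a single PyTorch multi-head self-attention layer; all shapes, layer sizes, and names (embed, attn, window_scores) are assumptions.

```python
# Hypothetical sketch: extract self-attention scores over windowed EEG tokens
# so that highly attended windows can be passed to time-frequency analysis.
import torch
import torch.nn as nn

n_windows, n_channels, win_len = 20, 19, 128   # one epoch split into 20 windows (assumed)
d_model, n_heads = 64, 4                       # illustrative model sizes

# Project each flattened EEG window to a token embedding.
embed = nn.Linear(n_channels * win_len, d_model)
attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

eeg = torch.randn(1, n_windows, n_channels, win_len)   # placeholder EEG epoch
tokens = embed(eeg.flatten(2))                          # (1, n_windows, d_model)

# need_weights=True returns attention averaged over heads,
# shaped (batch, n_windows, n_windows).
_, attn_weights = attn(tokens, tokens, tokens, need_weights=True)

# Attention received by each window, averaged over query positions:
# a simple relevance score for which EEG segments the model focuses on.
window_scores = attn_weights.mean(dim=1).squeeze(0)
print(window_scores)  # higher score -> candidate window for time-frequency analysis
```

In practice such scores would be averaged over subjects or epochs before being related to spectral features; this sketch only illustrates how attention weights can be read out of a self-attention layer.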


Cited by 3 publications (2 citation statements) · References: 72 publications
“…New and more sophisticated types of neural networks (transformers) were developed by enhancing the interconnections between their sub-units to detect more complex patterns in data [72,73]. This technology is evolving rapidly, and we would like to refer the interested reader to the latest literature in this domain.…”
Section: New Machine Learning Technologies
confidence: 99%
“…Supervised learning techniques are used in medical image analysis, disease classification, and predictive modeling. 59 Transformer models are used in medical NLP tasks, including language translation, text summarization, and clinical documentation. Later in this review, we have focused on GPT and its importance in medicine and its branches such as neonatology.…”
Section: Machine Learning (ML)
confidence: 99%