2023
DOI: 10.3389/fnhum.2022.1029784
Explainable artificial intelligence model to predict brain states from fNIRS signals

Abstract: Objective: Most Deep Learning (DL) methods for the classification of functional Near-Infrared Spectroscopy (fNIRS) signals do so without explaining which features contribute to the classification of a task or imagery. An explainable artificial intelligence (xAI) system that can decompose the Deep Learning model's output onto the input variables for fNIRS signals is described here. Approach: We propose an xAI-fNIRS system that consists of a classification module and an explanation module. The classification modul…

Cited by 6 publications (1 citation statement)
References 76 publications
“…An overall accuracy rate exceeding 75% was achieved when investigating whether referring to a product as expensive or inexpensive could influence its perceived value-for-money ( Misawa et al, 2014 ). In mental imagery, an accuracy of over 96% was achieved ( Shibu et al, 2023 ). Moreover, these machine learning models can differentiate between mind wandering and task-related episodes with an accuracy of 73.2% ( Liu et al, 2021 ).…”
Section: Introduction
Confidence: 99%