2019
DOI: 10.1109/access.2019.2928691

Interpretable Emotion Recognition Using EEG Signals

Abstract: Electroencephalogram (EEG) signal-based emotion recognition has attracted wide interest in recent years and has been broadly adopted in medical, affective computing, and other relevant fields. However, the majority of the research reported in this field tends to focus on the accuracy of classification whilst neglecting the interpretability of emotion progression. In this paper, we propose a new interpretable emotion recognition approach with the activation mechanism by using machine learning and EEG signals. …

Cited by 115 publications (49 citation statements)
References 29 publications

“…Several unimodal and multimodal emotion recognition studies have been reported in the literature [12][13][14][15][16]. Majority of studies use non-physiological data, such as audio, video and text [17].…”
Section: Background and Literature Review on Multimodal Emotion Recognition (mentioning)
Confidence: 99%
“…Many physiological modalities and features have been evaluated for ER, namely Electroencephalography (EEG) [28, 29, 30], Electrocardiography (ECG) [31, 32, 33], Electrodermal Activity (EDA) [34, 35, 36], Respiration (RESP) [26], Blood Volume Pulse (BVP) [26, 35] and Temperature (TEMP) [26]. Multi-modal approaches have prevailed; however, there is still no clear evidence of which feature combinations and physiological signals are the most relevant.…”
Section: State of the Art (mentioning)
Confidence: 99%
“…The real-world impact of BCIs is promising because they can identify intention-reflected brain activities. In the past decade, human-centered BCIs, such as those in mental fatigue detection tasks (Binias et al, 2020; Ko et al, 2020b), emotion recognition (Qing et al, 2019), and controlling exoskeletons (Lee et al, 2017) have shed light on the success of improving human ability. An active BCI (Fahimi et al, 2020) recognizes complex patterns from EEG spontaneously caused by a user's intention independent of external stimuli, and a reactive BCI (Won et al, 2019) identifies brain activities in reaction to external events.…”
Section: Introduction (mentioning)
Confidence: 99%