2013
DOI: 10.1016/j.imavis.2012.10.002

Fusion of facial expressions and EEG for implicit affective tagging

Cited by 197 publications (152 citation statements)
References 21 publications
“…[104] Employing images, sounds, and videos for emotion elicitation is also motivated by affective tagging applications, which consist in automatically assigning tags to multimedia contents. [105,106] In a psychophysiological study of emotion induced by music and film stimuli, Stephens et al [107] replicated the finding of autonomic specific basic emotions and demonstrated that the phenomenon of autonomic nervous system (ANS) specificity of emotion was not a function of the emotion induction technique. In [108], the authors showed that the emotion assessment performance obtained using visual and auditory stimuli is similar.…”
Section: Emotion Elicitation (mentioning)
confidence: 97%
“…[113] Focusing on valence and arousal, Soleymani et al [105] argue that the arousal dimension is better discriminated by brain activity than the valence dimension. When looking at the studies which analyzed the classification of both valence and arousal on two classes [105,106,110,114,115], the arousal accuracy is only marginally higher than the valence accuracy (valence mean accuracy is 65.6%, arousal mean accuracy is 68.2%), and it is difficult to conclude any potential advantage of neurophysiological signals for arousal assessment. It is unfortunately difficult to compare valence-arousal results with those obtained with basic emotions due to the difference in the number of classes employed.…”
Section: Assessed Emotions (mentioning)
confidence: 99%
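The comparison in the statement above is a straightforward aggregation: average the two-class accuracies per dimension across the cited studies, then compare the margins. A minimal sketch of that bookkeeping in Python; the per-study values here are hypothetical placeholders, not the figures from [105,106,110,114,115] (the survey itself reports means of 65.6% for valence and 68.2% for arousal):

    # Aggregate two-class classification accuracies per affective dimension.
    # Per-study values are hypothetical placeholders for illustration only.
    studies = {
        "study_A": {"valence": 0.63, "arousal": 0.66},
        "study_B": {"valence": 0.67, "arousal": 0.71},
        "study_C": {"valence": 0.66, "arousal": 0.68},
    }

    def mean_accuracy(dimension):
        return sum(s[dimension] for s in studies.values()) / len(studies)

    print(f"valence mean: {mean_accuracy('valence'):.1%}")  # 65.3% here
    print(f"arousal mean: {mean_accuracy('arousal'):.1%}")  # 68.3% here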
“…To the best of our knowledge, this is the first study with classification results using EEG signals. We demonstrate how EEG signals can be used for tagging on one of the publicly available databases [12,13,14].…”
Section: Introduction (mentioning)
confidence: 99%
“…Koelstra et al [11] used EEG signals to detect emotional tags for music videos. In a recent study, Koelstra and Patras [14] fused facial expression analysis and EEG signals to detect two classes of arousal, valence, and dominance on the MAHNOB-HCI database [12].…”
Section: Introduction (mentioning)
confidence: 99%
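The fusion described in that last statement is, at its core, a decision-level combination of two unimodal classifiers. Below is a minimal sketch of one common variant, weighted averaging of posterior probabilities, assuming each modality already produces a per-clip posterior for a binary tag such as high/low arousal; the weights, threshold, and example numbers are illustrative assumptions, not Koelstra and Patras's exact scheme:

    import numpy as np

    # Decision-level fusion sketch for implicit tagging: each modality
    # (facial expression analysis, EEG) yields a posterior probability
    # that a clip deserves the "high arousal" tag; the fused decision
    # averages the two posteriors and thresholds the result.
    def fuse_posteriors(p_face, p_eeg, w_face=0.5, threshold=0.5):
        """Weighted average of per-clip posteriors -> binary tags."""
        p_fused = w_face * p_face + (1.0 - w_face) * p_eeg
        return (p_fused >= threshold).astype(int)  # 1 = high arousal

    # Hypothetical posteriors for four video clips.
    p_face = np.array([0.80, 0.40, 0.55, 0.20])  # facial-expression model
    p_eeg  = np.array([0.70, 0.30, 0.65, 0.45])  # EEG model

    print(fuse_posteriors(p_face, p_eeg))  # -> [1 0 1 0]

Feature-level fusion (concatenating the modalities' feature vectors before a single classifier) is the usual alternative; decision-level fusion has the practical advantage that either modality can be dropped at test time without retraining.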