2010
DOI: 10.1109/titb.2010.2041553
Toward Emotion Aware Computing: An Integrated Approach Using Multichannel Neurophysiological Recordings and Affective Visual Stimuli

Abstract: This paper proposes a methodology for the robust classification of neurophysiological data into four emotional states, collected during passive viewing of emotionally evocative pictures selected from the International Affective Picture System. The proposed classification model follows current neuroscience trends: it adopts the independence of two emotional dimensions, namely arousal and valence, as dictated by the bidimensional emotion theory, while also being gender-specific. A two-step cla…

Cited by 215 publications (133 citation statements)
References 44 publications
“…Among the reported studies, a few have used only a limited number of electrodes (3 or 4) based on assumptions about brain activity localization such as the frontal lobe lateralization. [115,118,119] The results demonstrate that it is possible to reduce the number of electrodes without suffering from a drastic drop of performance. A method based on synchronization likelihood and anatomical knowledge was proposed in [120] to automatically select electrodes of interest.…”
Section: Number Of Channels
confidence: 84%
“…[113] Focusing on valence and arousal, Soleymani et al [105] argue that the arousal dimension is better discriminated by brain activity than the valence dimension. When looking at the studies which analyzed both the classification of valence and arousal on two classes [105,106,110,114,115] the valence accuracy is only marginally higher than the arousal accuracy (valence mean accuracy is 65.6%, arousal mean accuracy is 68.2%), and it is difficult to conclude any potential advantage of neurophysiological signals for arousal assessment. It is unfortunately difficult to compare valence-arousal results with those obtained with basic emotions due to the difference in the number of classes employed.…”
Section: Assessed Emotions
confidence: 99%
“…To prevent such incidents, the authors argue that the emotional stimuli must be subjectively evaluated and categorised prior to the undertaking of physiological measurements, in order to validate their effectiveness in evoking the required emotional experiences on the part of all users. So far, various affective stimuli types have been employed to perform human emotional experience assessment and classification (summarised in [25], [26], [27]), whilst a smaller number were found to employ other image datasets, such as affective face gestures [28]. The average image presentation duration was reported as 14.5 seconds (±22.88, minimum 1 and maximum 60 seconds).…”
Section: Stimuli and Physiological Measurements
confidence: 99%
“…The average stimulation duration was reported as 170 seconds (±75.5, minimum 90 and maximum 240 seconds).…”
confidence: 99%