2010
DOI: 10.1007/978-3-642-15314-3_9

Single Trial Classification of EEG and Peripheral Physiological Signals for Recognition of Emotions Induced by Music Videos

Abstract: Recently, the field of automatic recognition of users' affective states has gained a great deal of attention. Automatic, implicit recognition of affective states has many applications, ranging from personalized content recommendation to automatic tutoring systems. In this work, we present some promising results of our research in classification of emotions induced by watching music videos. We show robust correlations between users' self-assessments of arousal and valence and the frequency powers of t…
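The abstract's central claim is a correlation between self-assessed arousal/valence and EEG frequency-band powers. A minimal sketch of how such a correlation can be computed is shown below; the sampling rate, band limits, and all array names are illustrative assumptions, not details from the paper.

```python
import numpy as np
from scipy.signal import welch
from scipy.stats import spearmanr

FS = 256  # assumed EEG sampling rate in Hz

def band_power(x, fs=FS, band=(8.0, 12.0)):
    """Mean power spectral density of one EEG channel within a band (here: alpha)."""
    freqs, psd = welch(x, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Hypothetical data: one EEG channel per trial plus per-trial arousal self-assessments.
trials = np.random.randn(40, FS * 60)   # 40 trials x 60 s of signal
arousal = np.random.uniform(1, 9, 40)   # self-assessed arousal on a 1-9 scale

alpha_power = np.array([band_power(t) for t in trials])
rho, p = spearmanr(alpha_power, arousal)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```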

Cited by 175 publications (115 citation statements). References 18 publications.
“…These music video clips were carefully selected using a subjective test. More information about the selection procedure can be found in [10]. Before displaying each video a 5-second long baseline was recorded.…”
Section: Dataset
confidence: 99%
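As a rough illustration of how such a pre-stimulus baseline might be used, here is a short sketch; the sampling rate, channel count, and baseline-correction step are assumptions for illustration, not the cited authors' procedure.

```python
import numpy as np

FS = 128  # assumed sampling rate in Hz

def split_baseline(recording, baseline_s=5, fs=FS):
    """Split a trial into its 5 s pre-stimulus baseline and the stimulus segment."""
    n = baseline_s * fs
    return recording[:, :n], recording[:, n:]

# Hypothetical trial: 32 channels, 5 s baseline followed by 60 s of video.
recording = np.random.randn(32, FS * 65)
baseline, stimulus = split_baseline(recording)

# One common use of the baseline: subtract its per-channel mean from the stimulus part.
corrected = stimulus - baseline.mean(axis=1, keepdims=True)
```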
“…The users' behavior and spontaneous reactions to multimedia data can provide useful information for multimedia indexing in the following scenarios: (i) direct assessment of tags: users' spontaneous reactions are translated into emotional keywords, e.g., funny, disgusting, scary [18,13,16,19]; (ii) assessing the relevance of explicit tags or topic relevance, e.g., agreement or disagreement over a displayed tag or the relevance of a retrieved result [8,12,20,21]; (iii) user profiling: a user's personal preferences can be detected based on her reactions to retrieved data and used for re-ranking the results; (iv) content summarization: highlight detection is also possible using implicit feedback from the users [22,23].…”
Section: Introduction
confidence: 99%
“…The existing literature can be divided into two categories: one dealing with using emotional reactions to tag content with the expressed emotion, e.g., laughter detection for hilarity [5], and a second group of studies using spontaneous reactions for information retrieval or search results, e.g., eye gaze for relevance feedback [22]. A summary of the recent relevant literature on this topic is given in Table I. There have also been studies using unimodal or multimodal approaches for detecting behavioral or emotional responses to multimedia [21], [23], [24], [20], [6]. There is currently a research trend towards estimating emotions from multimedia content automatically [13], [15], [14].…”
Section: State of the Art
confidence: 99%
“…Koelstra et al [6] recorded EEG and peripheral physiological signals of six participants in response to music videos. Participants rated their felt emotions in terms of arousal, valence and like/dislike ratings.…”
Section: State of the Art
confidence: 99%
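Given that the paper concerns single-trial classification from such recordings and ratings, a minimal sketch of that kind of pipeline follows; the feature layout, label derivation, and choice of classifier are illustrative assumptions rather than the cited setup.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical features: per-trial EEG band powers (40 trials, 32 channels x 4 bands).
features = np.random.randn(40, 32 * 4)
# Hypothetical binary labels, e.g. high vs. low arousal from the self-assessments.
labels = np.random.randint(0, 2, 40)

# Standardize features, then classify each trial with a linear SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, features, labels, cv=5)
print(f"Mean single-trial accuracy: {scores.mean():.2f}")
```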