DOI: 10.1007/978-3-540-73078-1_36

A User Independent, Biosignal Based, Emotion Recognition Method

Cited by 55 publications (36 citation statements)
References 6 publications
“…For example, Picard et al [8] recorded physiological signals from a single subject (an actor) on multiple days. Conversely, other researchers [6], [32], [33] have attempted to build user-independent models that have the potential to generalize to new users. This is an important step for building practical affect sensitive applications, but yields significant challenges as physiological patterns vary from person to person and from situation to situation.…”
Section: User-independent Versus User-dependent Models
confidence: 99%
“…Some studies [6], [33] that attempted user-independent modeling used within-participants cross validation where instances from the same subject are in both the training and testing sets. A between-participants validation method (i.e., where the same subject is either in the training or test set but not in both) is more appropriate.…”
Section: User-independent Versus User-dependent Models
confidence: 99%
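The validation distinction drawn in this statement maps directly onto how cross-validation folds are grouped. As a minimal sketch, assuming scikit-learn is available, the following shows a between-participants split using GroupKFold; the feature matrix, emotion labels, and random-forest classifier are placeholders for illustration, not taken from the cited work.

```python
# Between-participants cross-validation sketch. X, y, and the subject IDs
# below are random placeholders, not data from the cited studies; the
# classifier choice is illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import GroupKFold

rng = np.random.default_rng(0)
n_subjects, per_subject, n_features = 6, 20, 8

X = rng.normal(size=(n_subjects * per_subject, n_features))  # placeholder biosignal features
y = rng.integers(0, 3, size=n_subjects * per_subject)        # placeholder emotion labels
subjects = np.repeat(np.arange(n_subjects), per_subject)     # subject ID for each instance

# GroupKFold keeps all instances of a subject within a single fold, so no
# subject ever appears in both the training and the test set: this is the
# between-participants scheme the citing authors recommend.
scores = []
for train_idx, test_idx in GroupKFold(n_splits=n_subjects).split(X, y, groups=subjects):
    clf = RandomForestClassifier(random_state=0).fit(X[train_idx], y[train_idx])
    scores.append(accuracy_score(y[test_idx], clf.predict(X[test_idx])))

print(f"Mean between-participants accuracy: {np.mean(scores):.3f}")
```

Swapping GroupKFold for an ordinary KFold would scatter each subject's instances across training and test folds, producing the within-participants setup the statement cautions against and typically inflating the reported accuracy.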
“…This database has emotional data collected from three males over 25 days using musical emotional induction (Cheng and Liu 2008; Cong and Chetouani 2009; Jonghwa and Ande 2008). Researchers have also used other modes of emotion elicitation, such as visual stimuli from the international affective picture system (IAPS) database (Rigas et al 2007), audio stimuli from the international affective digitized sounds (Stevenson and James 2008), audio-visual movie clips, and emotional recall paradigms to elicit various emotions (André et al 2004; Chanel et al 2009; Fatma et al 2004; Lan and Jihua 2006; Rigas et al 2007; Roberts and Coan 2005). In (Arroyo-Palacious and Romano 2008), EMG signals are used to classify positive (happy and surprise) and negative (sad, disgust, fear, and anger) emotions.…”
Section: Related Work
confidence: 99%
“…In [28] the authors propose a method for recognizing happiness, disgust, and fear from input signals such as facial electromyograms, the electrocardiogram, respiration, and the electrodermal skin response. The authors of [29] propose another emotion recognition system that relies on the electrocardiogram, skin temperature variation, and electrodermal activity as input signals.…”
Section: Related Work
confidence: 99%