2005
DOI: 10.1007/11573548_73

A Multimodal Database as a Background for Emotional Synthesis, Recognition and Training in E-Learning Systems

Abstract: This paper presents a multimodal database developed within the EU-funded project MYSELF. The project aims at developing an e-learning platform endowed with affective computing capabilities for the training of relational skills through interactive simulations. The database includes data from 34 participants concerning physiological parameters, vocal nonverbal features, facial expression and posture. Ten different emotions were considered (anger, joy, sadness, fear, contempt, shame, guilt, pr…
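As a rough sketch of how one elicitation trial in such a multimodal corpus might be represented in code (the class and field names below are hypothetical and are not taken from the MYSELF database documentation), consider the following Python example:

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical record layout for a single elicitation trial; the field names
# are illustrative only and do not reflect the actual MYSELF database schema.
@dataclass
class TrialRecord:
    participant_id: int                  # one of the 34 participants
    emotion_label: str                   # e.g. "anger", "joy", "sadness", ...
    physiological: Dict[str, List[float]] = field(default_factory=dict)  # sampled signals
    vocal_features: Dict[str, float] = field(default_factory=dict)       # nonverbal prosodic measures
    facial_expression: Dict[str, float] = field(default_factory=dict)    # coded facial activity
    posture: Dict[str, float] = field(default_factory=dict)              # posture descriptors

# Example of a single (made-up) entry:
record = TrialRecord(
    participant_id=1,
    emotion_label="joy",
    physiological={"skin_conductance": [0.41, 0.44, 0.47]},
    vocal_features={"f0_mean_hz": 212.5},
    facial_expression={"smile_intensity": 0.8},
    posture={"lean_forward": 0.3},
)
```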

Cited by 8 publications (2 citation statements)
References 12 publications
“…The results showed that it was necessary to use event-related features and emphasized the significance of improving the adaptive decision fusion method for emotion classification. Anolli et al [72] developed an E-learning system endowed with emotion computing ability for the cultivation of relational skills based on 10 different kinds of emotions detected from the facial expressions, vocal nonverbal features and postures of 34 subjects. Picard et al [73] developed a machine with the ability to recognize daily eight emotional states from EMG, EDA, blood volume pressure and respiration signals over multiple weeks.…”
Section: Multiple Physiological Signals (mentioning)
confidence: 99%
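The adaptive decision fusion referred to in this statement is typically a decision-level combination of per-modality classifier outputs. The sketch below shows a generic weighted decision fusion in Python; the function name, weights, and probability vectors are illustrative assumptions, not the method used in the cited works:

```python
import numpy as np

def fuse_decisions(per_modality_probs, weights=None):
    """Combine per-modality class-probability vectors by weighted averaging.

    per_modality_probs: list of 1-D arrays, one probability vector per modality
                        (e.g. physiological, vocal, facial), all over the same
                        set of emotion classes.
    weights: optional per-modality reliability weights; uniform if omitted.
    """
    probs = np.vstack(per_modality_probs)
    if weights is None:
        weights = np.ones(len(per_modality_probs))
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    fused = weights @ probs            # weighted average over modalities
    return int(np.argmax(fused)), fused

# Example: three modalities voting over three emotion classes.
label, fused = fuse_decisions(
    [np.array([0.6, 0.3, 0.1]),        # physiological classifier
     np.array([0.2, 0.5, 0.3]),        # vocal classifier
     np.array([0.5, 0.4, 0.1])],       # facial classifier
    weights=[0.5, 0.3, 0.2],
)
```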
“…All the modules provide information about behavior and physiological variations displayed by the user at a certain stage of the interaction: Variations should be processed by the inferential system in order to assess emotional state using an advanced statistical approach, including different learning and classification algorithms. To this end, it is critically necessary to define a training database, as a fundamental prerequisite for developing reliable classification algorithms (Anolli et al, 2005).…”
Section: Guidelines For Designing An Inferential System Of Emotion Recognition (mentioning)
confidence: 99%
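As a minimal illustration of why such a training database is a prerequisite for reliable classification, the sketch below trains and cross-validates a simple classifier on labeled multimodal feature vectors. It uses placeholder random data and a generic scikit-learn pipeline; the feature dimensions, labels, and model choices are assumptions, not details from the cited study:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: 340 trials (34 participants x 10 emotions), 24 features
# per trial concatenated across modalities. In practice X and y would come
# from the annotated multimodal database.
rng = np.random.default_rng(0)
X = rng.normal(size=(340, 24))
y = rng.integers(0, 10, size=340)      # ten emotion classes

# Simple baseline: standardize features, then a support vector classifier.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())
```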