2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops
DOI: 10.1109/acii.2009.5349552
PAD-based multimodal affective fusion

Abstract: The study of multimodality is comparatively less developed for affective interfaces than for their traditional counterparts. However, one condition for the successful development of affective interface technologies is the availability of frameworks for real-time multimodal fusion. In this paper, we describe an approach to multimodal affective fusion which relies on a dimensional model, Pleasure-Arousal-Dominance (PAD), to support the fusion of affective modalities, each input modality being represented as a …

Cited by 22 publications (15 citation statements). References 42 publications (38 reference statements).
“…The aggregate user affective experience is then characterised by the trajectory of this vector in 3-D PAD space, over the length of an interactive session (Figure 1d). This is described in more detail in [12].…”
Section: PAD-based Multimodal Fusion
confidence: 99%
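The quoted statement describes each input modality contributing a vector in 3-D PAD space, with the fused vector's trajectory over a session characterising the user's affective experience. A minimal sketch of that idea, assuming a confidence-weighted average as the fusion rule (the paper's exact scheme is not reproduced here, and the modality estimates below are hypothetical):

```python
# Illustrative sketch, not the paper's exact algorithm: each modality
# yields a Pleasure-Arousal-Dominance (PAD) vector plus a confidence
# weight; fusion is a confidence-weighted average, and the session is
# summarised by the trajectory of the fused vector over time.

def fuse_pad(modalities):
    """modalities: list of (pad_vector, confidence) pairs, where
    pad_vector is a (pleasure, arousal, dominance) triple."""
    total = sum(c for _, c in modalities)
    if total == 0:
        return (0.0, 0.0, 0.0)  # neutral point of PAD space
    return tuple(
        sum(v[i] * c for v, c in modalities) / total
        for i in range(3)
    )

# One fused sample per time step builds the session trajectory.
speech = ((0.4, 0.7, -0.1), 0.8)   # hypothetical speech estimate
posture = ((0.2, 0.3, 0.5), 0.5)   # hypothetical posture estimate
trajectory = [fuse_pad([speech, posture])]
```

A dimensional model makes this kind of fusion straightforward, since heterogeneous modalities all project into the same continuous space before being combined.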
“…We have previously described a system to study how multimodal affective interfaces could be used to capture user experience, using a variety of unobtrusive channels such as video and speech [12]. Input modalities consist of both user attitudes (bodily movements and posture [17,22], as well as more traditional non-verbal behaviour [14]), and affective content of speech utterances (considering both acoustic parameters and affective interpretation of specific keywords).…”
Section: Rationale
confidence: 99%
“…The proposed fusion algorithm is based on preceding work done by [8] (section 2.3). We generalize this approach by designing a fusion scheme that operates in a user-defined vector space.…”
Section: Algorithm
confidence: 99%
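The generalization this citing paper describes, operating in a user-defined vector space rather than fixed 3-D PAD space, can be sketched as a dimension-agnostic version of weighted fusion (an assumed interface for illustration, not the cited paper's code):

```python
# Sketch of fusion in a user-defined n-dimensional vector space
# (assumed interface): same weighted-average idea as PAD fusion,
# but the dimensionality is a parameter rather than fixed at 3.

def fuse_vectors(samples, dim):
    """samples: list of (vector, weight) pairs; each vector has length dim."""
    total = sum(w for _, w in samples)
    if total == 0:
        return [0.0] * dim  # origin of the chosen space
    return [sum(v[i] * w for v, w in samples) / total for i in range(dim)]

# Example in a 2-D valence/arousal space instead of 3-D PAD:
fused = fuse_vectors([([0.6, -0.2], 1.0), ([0.2, 0.4], 3.0)], dim=2)
```

Parameterising the space lets the same fusion machinery serve 2-D valence/arousal models, 3-D PAD, or any application-specific affective representation.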