2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2016.7472180
Agreement and disagreement classification of dyadic interactions using vocal and gestural cues

Cited by 4 publications (2 citation statements); references 17 publications.
"…There is an increasing body of literature on detecting different social attitudes such as dominance, agreement, or engagement in interactions (Dermouche and Pelachaud 2019a; Khaki et al. 2016; Bee et al. 2009). These tackle the interaction from video recordings and could be applicable to VCs on 2D displays.…"

Section: Detecting Social Attitudes
Confidence: 99%
"…In our previous study [23], we investigate a multimodal two-class dyadic interaction type (DIT) estimation approach of agreement and disagreement classes from speech and motion.…"

[Figure 2: Dyadic Interaction Type - Continuous Emotion Recognition (DIT-CER) system]

Section: Dyadic Interaction Type Estimation
Confidence: 99%