2021
DOI: 10.1007/s12193-021-00364-0

MUMBAI: multi-person, multimodal board game affect and interaction analysis dataset

Abstract: Board games are fertile grounds for the display of social signals, and they provide insights into psychological indicators in multi-person interactions. In this work, we introduce a new dataset collected from four-player board game sessions, recorded via multiple cameras, and containing over 46 hours of visual material. The new MUMBAI dataset is extensively annotated with emotional moments for all game sessions. Additional data comes from personality and game experience questionnaires. Our four-person setup al…


Cited by 22 publications (16 citation statements) · References 68 publications
“…Usually, third-view datasets consist of structured interactions in which participants follow basic directives that favor spontaneous and fluent interaction. Although conversations are the most common interaction structure, some datasets aim to elicit specific social signals such as leadership, competitiveness, empathy, or affect, and therefore engage the participants in competitive/cooperative scenarios (Hung and Chittaranjan, 2010; Sanchez-Cortes et al., 2012; Rehg et al., 2013; Ringeval et al., 2013; Vella and Paggio, 2013; Bambach et al., 2015; Salter et al., 2015; Edwards et al., 2016; Beyan et al., 2016; Georgakis et al., 2017; Doyran et al., 2021). Other datasets instead record in-the-wild interactions during so-called cocktail parties (Alameda-Pineda et al., 2016; Cabrera-Quiros et al., 2018) and represent very interesting benchmarks for studying group dynamics.…”
Section: Datasets (citation classified as mentioning, confidence: 99%)
“…The latter is considerably less frequent due to its tedious manual annotation process (McCowan et al., 2005; Douglas-Cowie et al., 2007; McKeown et al., 2010; Lücking et al., 2012; Vella and Paggio, 2013; Vandeventer et al., 2015; Naim et al., 2015; Chou et al., 2017; Paggio and Navarretta, 2017; Cafaro et al., 2017; Joo et al., 2019b; Kossaifi et al., 2019; Chen et al., 2020; Khan et al., 2020). The most frequent low-level annotations that the datasets provide are the participants' body poses and facial expressions (Douglas-Cowie et al., 2007; Rehg et al., 2013; Bilakhia et al., 2015; Vandeventer et al., 2015; Naim et al., 2015; Edwards et al., 2016; Cafaro et al., 2017; Feng et al., 2017; Georgakis et al., 2017; Paggio and Navarretta, 2017; Bozkurt et al., 2017; Andriluka et al., 2018; von Marcard et al., 2018; Mehta et al., 2018; Lemaignan et al., 2018; Joo et al., 2019b; Kossaifi et al., 2019; Schiphorst et al., 2020; Doyran et al., 2021). Given their annotation complexity, these are usually retrieved automatically with tools like OpenPose (Cao et al., 2019) and then manually fixed or discarded.…”
Section: Datasets (citation classified as mentioning, confidence: 99%)
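The retrieve-then-filter workflow this excerpt describes can be sketched in a few lines. The sketch below is purely illustrative: `filter_pose`, the per-keypoint confidence thresholds, and the discard criterion are assumptions, not part of any cited dataset's actual pipeline; only the (x, y, confidence) keypoint format is modeled on OpenPose's output convention.

```python
# Illustrative sketch of "automatically retrieved, then manually fixed
# or discarded" pose annotations. Keypoints follow OpenPose's
# (x, y, confidence) triplet convention; thresholds are assumptions.

def filter_pose(keypoints, min_conf=0.3, min_valid=0.5):
    """Keep a detected pose only if enough keypoints are confident.

    keypoints: list of (x, y, confidence) triplets for one person.
    Returns the pose with low-confidence joints zeroed out, or None
    if too few joints are reliable (i.e. the pose is discarded and
    would need manual annotation).
    """
    cleaned = [
        (x, y, c) if c >= min_conf else (0.0, 0.0, 0.0)
        for (x, y, c) in keypoints
    ]
    valid = sum(1 for (_, _, c) in cleaned if c > 0.0)
    if valid / len(keypoints) < min_valid:
        return None  # discard: detection too unreliable to keep
    return cleaned

# Example: a 4-joint pose with one unreliable detection; the weak
# joint is zeroed out, but the pose as a whole is kept.
pose = [(10, 20, 0.9), (12, 25, 0.8), (0, 0, 0.1), (15, 30, 0.7)]
print(filter_pose(pose))
```

In practice a real pipeline would operate on OpenPose's full 25-joint body model and per-frame JSON output, but the keep/zero/discard decision structure is the same.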
“…Facial expression analysis and video games have been combined in multiple studies discussing various topics, such as affective gaming [22,27], game personalisation [7], player affect evaluation [24,28] and alternative gameplay mechanisms [25]. Recently, Doyran et al [9] published a rich dataset that enables multi-modal, multi-player affect and interaction analysis through capturing the facial expressions of board game players.…”
Section: Related Work (citation classified as mentioning, confidence: 99%)
“…Musical works contain rich human emotion, and emotion plays an indispensable role in the transmission, understanding, and appreciation of music [1-3]. With the current development of Internet technology and artificial intelligence, the amount of digital music is growing rapidly.…”
Section: Introduction (citation classified as mentioning, confidence: 99%)