Proceedings of the ACM 2011 Conference on Computer Supported Cooperative Work
DOI: 10.1145/1958824.1958892
See what I'm saying?

Abstract: To create intelligent collaborative systems able to anticipate and react appropriately to users' needs and actions, it is crucial to develop a detailed understanding of the process of collaborative reference. We developed a dyadic eye tracking methodology and metrics for studying the multimodal process of reference, and applied these techniques in an experiment using a naturalistic conversation elicitation task. We found systematic differences in linguistic and visual coordination between pairs of mobile and s…

Cited by 42 publications (9 citation statements)
References 30 publications (35 reference statements)
“…This work aimed to understand how the shared visual attention of pairs changed over time as workload shifted from low to high. The results show a positive relationship between gaze overlap and performance over time, supporting previous work (Devlin et al., 2019; Gergle & Clark, 2011; Hajari et al., 2016). Specifically, they suggest performance improves when participants substantially increase their shared visual attention on the AOI causing the workload change (i.e.…”
Section: Discussion (supporting)
confidence: 85%
“…For example, one commonly used metric for characterizing shared visual attention is percent gaze overlap, which quantifies the amount of time multiple observers are concurrently viewing the same predetermined area of the display (Pietinen, Bednarik, & Tukiainen, 2010). Increases in percent gaze overlap have corresponded with improved performance (Gergle & Clark, 2011; Hajari, Cheng, Zheng, & Basu, 2016), but this is not always the case (e.g., Villamor & Rodrigo, 2018).…”
Section: Introduction (mentioning)
confidence: 99%
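The percent gaze overlap metric described in the citation above can be sketched in a few lines: given two time-aligned streams of area-of-interest (AOI) labels, it is the fraction of samples at which both observers fixate the same AOI. This is a minimal illustrative implementation, not code from any of the cited papers; the function name, data format, and AOI labels are assumptions.

```python
def percent_gaze_overlap(gaze_a, gaze_b):
    """Fraction of time-aligned samples at which both observers
    fixate the same area of interest (AOI).

    gaze_a, gaze_b: equal-length sequences of AOI labels, one per
    sample at a common sampling rate; None marks samples with no
    fixation (blinks, off-screen gaze).
    """
    if len(gaze_a) != len(gaze_b):
        raise ValueError("gaze streams must be time-aligned")
    if not gaze_a:
        return 0.0
    # Count samples where both observers look at the same AOI.
    overlap = sum(
        1 for a, b in zip(gaze_a, gaze_b)
        if a is not None and a == b
    )
    return overlap / len(gaze_a)

# Example: two observers sampled at 5 Hz over 2 seconds, with
# hypothetical AOIs "map" and "chat".
a = ["map", "map", "chat", "map", "map", None, "map", "map", "chat", "map"]
b = ["map", "chat", "chat", "map", "map", "map", "map", None, "chat", "map"]
print(percent_gaze_overlap(a, b))  # → 0.7
```

In practice the streams would first be resampled to a common clock and AOI hits computed from fixation coordinates; the helper above only covers the final overlap step.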
“…The development of dyadic eye-tracking approaches made it possible to study gaze coordination processes during dyadic communication (e.g., "Does Susan's gaze pattern influence John's speech?"; see Gergle and Clark 2011; Jermann et al. 2011; Richardson et al. 2007). Even more importantly, the availability of mobile eye-tracking glasses in recent years allows for the study of communicative processes in naturalistic and unconstrained environments such as tutoring and small-group scaffolding (see Haataja et al. 2020), and in even more complex one-to-many communication settings such as whole-class instruction.…”
Section: An Emerging Research Paradigm (mentioning)
confidence: 96%
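A common analysis in this dyadic paradigm (cf. the Richardson et al. 2007 line of work cited above) is gaze cross-recurrence: the proportion of samples at which one partner's AOI at time t matches the other partner's AOI at time t + lag, evaluated across a range of lags to reveal who leads whom and by how much. The sketch below is a hypothetical minimal implementation; the function name and data format are assumptions, not taken from the cited papers.

```python
def recurrence_at_lag(gaze_a, gaze_b, lag):
    """Fraction of comparable samples at which observer A's AOI at
    time t equals observer B's AOI at time t + lag.

    gaze_a, gaze_b: equal-length sequences of AOI labels at a
    common sampling rate; None marks samples with no fixation.
    lag: integer offset in samples (positive = B lags behind A).
    """
    n = len(gaze_a)
    if lag >= 0:
        pairs = list(zip(gaze_a[: n - lag], gaze_b[lag:]))
    else:
        pairs = list(zip(gaze_a[-lag:], gaze_b[: n + lag]))
    if not pairs:
        return 0.0
    hits = sum(1 for a, b in pairs if a is not None and a == b)
    return hits / len(pairs)

# Example: observer B's gaze follows observer A's with a
# one-sample delay, so recurrence peaks at lag 1.
a = ["x", "y", "z", "x"]
b = [None, "x", "y", "z"]
print(recurrence_at_lag(a, b, 1))  # → 1.0
print(recurrence_at_lag(a, b, 0))  # → 0.0
```

Sweeping `lag` over, say, -30 to +30 samples and plotting the resulting curve gives the cross-recurrence profile typically reported in this literature.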
“…lying cognitive process (Aleven, Rau, & Rummel, 2012; Sharma, Jermann, Nüssli & Dillenbourg, 2012) or dialogues (Gergle & Clark, 2011). In a pair program-comprehension study, Sharma et al. (2012) showed that the pair's gaze patterns can differentiate between episodes of linear reading and episodes of understanding the program's data flow.…”
Section: RITPU • IJTHE (mentioning)
confidence: 99%
“…In a pair-programming task, Sharma et al. (2012) and Sharma, Jermann, Nüssli & Dillenbourg (2013) demonstrated that certain dialogue episodes correspond to higher gaze proportions at certain areas on the computer screen. In a collaborative elicitation task on a mobile screen, Gergle & Clark (2011) showed that the movement of mobile partners can act as a coordination mechanism for their explicit deictic gestures.…”
Section: RITPU • IJTHE (mentioning)
confidence: 99%