Proceedings of the 20th International Conference on Intelligent User Interfaces 2015
DOI: 10.1145/2678025.2701384

Attention Engagement and Cognitive State Analysis for Augmented Reality Text Display Functions

Abstract: Human eye gaze has recently been used as an effective input interface for wearable displays. In this paper, we propose a gaze-based interaction framework for optical see-through displays. The proposed system can automatically judge whether a user is engaged with virtual content in the display or focused on the real environment and can determine his or her cognitive state. With these analytic capacities, we implement several proactive system functions including adaptive brightness, scrolling, messaging, notific…
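
The abstract describes the core capability only at a high level: deciding from gaze data whether the wearer is attending to the virtual text plane of the see-through display or to the real scene. As a rough illustration of that general idea only, and not the authors' actual method, the sketch below classifies an engagement state from hypothetical binocular gaze samples using an assumed fixation-depth estimate and a dwell window; every name, threshold, and data format here is an assumption made for illustration.

    # Hypothetical sketch (not the paper's algorithm): decide whether the wearer
    # is attending to the near virtual text plane or the real environment, from
    # fixation depth (e.g. estimated via vergence) plus a short dwell window.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class GazeSample:
        x: float            # normalized horizontal gaze position in display coords (0..1)
        y: float            # normalized vertical gaze position (0..1)
        depth_m: float      # estimated fixation depth in metres (assumed available)
        timestamp_s: float  # sample time in seconds

    DISPLAY_DEPTH_M = 2.0   # assumed virtual image distance of the see-through display
    DEPTH_TOLERANCE = 0.5   # how close fixation depth must be to the display plane
    DWELL_S = 0.3           # dwell window before the engagement state may switch

    def classify_engagement(samples: List[GazeSample]) -> str:
        """Return 'virtual' if recent fixations sit near the display plane and
        inside the content region, otherwise 'real'."""
        if not samples:
            return "real"
        latest = samples[-1].timestamp_s
        recent = [s for s in samples if latest - s.timestamp_s <= DWELL_S]
        on_plane = [
            s for s in recent
            if abs(s.depth_m - DISPLAY_DEPTH_M) <= DEPTH_TOLERANCE
            and 0.0 <= s.x <= 1.0 and 0.0 <= s.y <= 1.0
        ]
        # Require a majority of the dwell window to lie on the virtual plane.
        return "virtual" if len(on_plane) > len(recent) / 2 else "real"

In the paper's setting, such a decision would gate the proactive functions (for example, pausing automatic scrolling when the user looks back at the real environment); the thresholds above are illustrative, not values reported by the authors.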

Cited by 20 publications (6 citation statements)
References 24 publications
“…Some measure attention using classroom observation, but online aspects of a blended course may be at a distance, making such techniques impractical. Other methods for measuring attention during online instruction track eye movement (Boucheix, Lowe, Putri, & Groff, 2013; Miller, 2015; Toyama, Sonntag, Orlosky, & Kiyokawa, 2015), brainwaves (Sun, 2013), or gross body language (D'Mello et al., 2008). Already intelligent tutoring systems attempt to reengage students when they perceive waning attention (D'Mello et al., 2008), and as understanding of blended and online learner engagement improves, data-rich systems will sense ebbing attention and provide real-time feedback to both learner and instructor (Bienkowski, Feng, & Means, 2012).…”
Section: Cognitive Engagement
confidence: 99%
“…For instance, prior research proposed to use eye tracking and HMDs to augment the episodic memory of dementia patients by storing artificial memory sequences and presenting them when needed [39]. Other works include approaches for gaze-based analysis of the users’ attention engagement and cognitive states for proactive content visualization [40], and multi-focal plane interaction, such as object selection and manipulation at multiple fixation distances [41]. It can also be used in research regarding selection techniques in AR [42, 43].…”
Section: Discussion
confidence: 99%
“…Eye tracking has been recognized as a promising interaction technology for mobile AR systems (see, e.g., [10]) and several approaches have demonstrated its applicability for a diverse range of applications [18, 28, 29, 35]. However, today we still do not find any commercially available AR system with eye tracking built in.…”
Section: Related Work
confidence: 99%