“…In the domain of AR glasses, this research problem is one of attention management and augmentation in AR interfaces [4]. Attention guidance is relevant in many application areas, such as virtual teleconferences, visual search (e.g.…”
Section: Related Work
“…The omnidirectional attention funnel (OAF) is an animated visual guiding system, in which a flexible tunnel of frames is drawn from the current head position and orientation to the intended position and orientation when facing the target [4]. In a comparison study against audio cuing by naming and selection-box highlighting, the OAF could improve search assistance performance in terms of shorter search times, lower errors, and a lower cognitive load.…”
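The funnel construction described above can be sketched as placing a sequence of frames along a smooth curve from the head pose to the target. The quadratic Bézier curve, the frame count, and the function name below are illustrative choices of this sketch, not details taken from the original OAF paper:

```python
import math

def attention_funnel_frames(head_pos, head_dir, target_pos, n_frames=10):
    """Place n_frames funnel frames along a curve from the head pose
    toward the target. A quadratic Bezier is one simple choice: it starts
    at the head, leaves along the current view direction, and ends at the
    target, so successive frames lead the eye around to the target."""
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def add(a, b): return tuple(x + y for x, y in zip(a, b))
    def scale(a, s): return tuple(x * s for x in a)
    def norm(a): return math.sqrt(sum(x * x for x in a))
    def unit(a):
        n = norm(a)
        return tuple(x / n for x in a)

    d = unit(head_dir)
    # Control point ahead of the head keeps the curve tangent to the view direction.
    ctrl = add(head_pos, scale(d, 0.5 * norm(sub(target_pos, head_pos))))

    frames = []
    for i in range(1, n_frames + 1):
        t = i / n_frames
        # Quadratic Bezier position between head and target.
        p = add(add(scale(head_pos, (1 - t) ** 2), scale(ctrl, 2 * (1 - t) * t)),
                scale(target_pos, t ** 2))
        # Orient each frame along the curve tangent at t.
        tan = unit(add(scale(sub(ctrl, head_pos), 2 * (1 - t)),
                       scale(sub(target_pos, ctrl), 2 * t)))
        frames.append((p, tan))
    return frames
```

The last frame always coincides with the target, while the first frames sit near the current line of sight, which is what makes the funnel followable from an arbitrary initial orientation.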
Section: Approaches For Visual Attention Guidance
Figure 1: We are targeting the problem of attention guiding in assembly and picking tasks (left). To be able to systematically and device independently evaluate own designs (bottom right) and established designs (e.g. arrow, top right), we simulate the application context and the AR device in virtual reality (middle).
ABSTRACT

A limiting factor of current smart glasses-based augmented reality (AR) systems is their small field of view. AR assistance systems designed for tasks such as order picking or manual assembly are supposed to guide the visual attention of the user towards the item that is relevant next. This is a challenging task, as the user may initially be in an arbitrary position and orientation relative to the target. As a result of the small field of view, in most cases the target will initially not be covered by the AR display, even if it is visible to the user. This raises the question of how to design attention guiding for such "off-screen gaze" conditions.

The central idea put forward in this paper is to display cues for attention guidance in a way that they can still be followed using peripheral vision. While the eyes' focus point is beyond the AR display, certain visual cues presented on the display are still detectable by the human. In addition, guidance methods that adapt to the eye movements of the user are introduced and evaluated.

Within a research project on smart glasses-based assistance systems for a manual assembly station, several attention guiding techniques with and without eye tracking have been designed, implemented and tested. Simulated AR in a virtual reality HMD setup was used as the evaluation method, as it supports a repeatable and highly controlled experimental design.
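One simple way to realize the off-screen cue idea from the abstract is to clamp the direction to the target onto the border of the small display, so that a peripheral cue marks which way to turn. In the sketch below, the linear angle-to-screen mapping, the FOV parameters, and the function name are assumptions of this illustration, not the paper's actual method:

```python
import math

def edge_cue_position(yaw_to_target, pitch_to_target, half_fov_x, half_fov_y):
    """Map the direction to a (possibly off-screen) target onto a small
    AR display. Angles are in radians; the display spans +/- half_fov_x
    horizontally and +/- half_fov_y vertically around the view axis.
    Returns normalized display coordinates in [-1, 1] x [-1, 1]."""
    x = yaw_to_target / half_fov_x
    y = pitch_to_target / half_fov_y
    m = max(abs(x), abs(y))
    if m <= 1.0:
        return (x, y)          # target direction already inside the display FOV
    return (x / m, y / m)      # clamp the cue onto the display border
```

A cue rendered at the clamped border position stays detectable in peripheral vision even when the user's gaze is beyond the display, which is the "off-screen gaze" condition the paper addresses.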
“…Feiner et al. [11] used a 3D rubberband line drawn from a screen-fixed label to a possibly offscreen target object or location. Biocca et al. developed the "Attention Funnel" [5], a vector tunnel drawn to a target, similar to "tunnel-in-the-sky" aviation cockpit head-up displays, and showed that it reduced search time compared to world-fixed labels or audible cues. Tönnis and Klinker [35] demonstrated that an egocentrically aligned screen-fixed 3D arrow projected in AR was faster at directing a car driver's attention than an exocentric alternative.…”
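The egocentric 3D arrow mentioned above amounts to computing the relative yaw and pitch from the user's current heading to the target. The coordinate conventions (z up, yaw about the vertical axis) and the function name in this sketch are assumptions for illustration, not details from the cited work:

```python
import math

def arrow_angles(head_pos, head_yaw, target_pos):
    """Yaw/pitch a screen-fixed 3D arrow should take so that, in the
    user's egocentric frame, it points toward the target.

    head_yaw is the user's heading in radians about the vertical (z)
    axis; returns (relative_yaw, pitch) in radians, i.e. how far the
    arrow must rotate away from straight ahead."""
    dx = target_pos[0] - head_pos[0]
    dy = target_pos[1] - head_pos[1]
    dz = target_pos[2] - head_pos[2]
    world_yaw = math.atan2(dy, dx)
    # Wrap the relative yaw into [-pi, pi) so the arrow turns the short way.
    rel_yaw = (world_yaw - head_yaw + math.pi) % (2 * math.pi) - math.pi
    pitch = math.atan2(dz, math.hypot(dx, dy))
    return rel_yaw, pitch
```

Because the arrow is screen-fixed, only these two angles change as the user moves, which keeps the cue inside even a very small display FOV.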
Abstract: We explore the development of an experimental augmented reality application that provides benefits to professional mechanics performing maintenance and repair tasks in a field setting. We developed a prototype that supports military mechanics conducting routine maintenance tasks inside an armored vehicle turret, and evaluated it with a user study. Our prototype uses a tracked head-worn display to augment a mechanic's natural view with text, labels, arrows, and animated sequences designed to facilitate task comprehension, localization, and execution. A within-subject controlled user study examined professional military mechanics using our system to complete 18 common tasks under field conditions. These tasks included installing and removing fasteners and indicator lights, and connecting cables, all within the cramped interior of an armored personnel carrier turret. An augmented reality condition was tested against two baseline conditions: the same head-worn display providing untracked text and graphics and a fixed flat panel display representing an improved version of the laptop-based documentation currently employed in practice. The augmented reality condition allowed mechanics to locate tasks more quickly than when using either baseline, and in some instances, resulted in less overall head movement. A qualitative survey showed that mechanics found the augmented reality condition intuitive and satisfying for the tested sequence of tasks.
“…Feiner, MacIntyre, and Seligmann [10] used a 3D rubberband line drawn from a screen-fixed label to a possibly offscreen target object or location. Biocca and colleagues developed the "Attention Funnel" [5], a vector tunnel drawn to a target, similar to "tunnel-in-the-sky" aviation cockpit head-up displays, and showed that it reduced search time compared to world-fixed labels or audible cues. Tönnis and Klinker [28] demonstrated that an egocentrically aligned screen-fixed 3D arrow projected in AR was faster at directing a car driver's attention than an exocentric alternative.…”
We present the design, implementation, and user testing of a prototype augmented reality application to support military mechanics conducting routine maintenance tasks inside an armored vehicle turret. Our prototype uses a tracked head-worn display to augment a mechanic's natural view with text, labels, arrows, and animated sequences designed to facilitate task comprehension, location, and execution. A within-subject controlled user study examined professional military mechanics using our system to complete 18 common tasks under field conditions. These tasks included installing and removing fasteners and indicator lights, and connecting cables, all within the cramped interior of an armored personnel carrier turret. An augmented reality condition was tested against two baseline conditions: an untracked head-worn display with text and graphics and a fixed flat panel display representing an improved version of the laptop-based documentation currently employed in practice. The augmented reality condition allowed mechanics to locate tasks more quickly than when using either baseline, and in some instances, resulted in less overall head movement. A qualitative survey showed mechanics found the augmented reality condition intuitive and satisfying for the tested sequence of tasks.